A lot of my time on iNaturalist is spent zooming in on photos in a separate browser window, trying to make out barely visible or entirely invisible features. It was a bit disappointing to realize that iNaturalist is deliberately deleting a lot of the information that I'm squinting to try to make out. Of course, I understand that server space and bandwidth aren't free. When I click on an image to make it bigger, just show me the full-resolution version that's already hosted. For instance, if an image is already on Flickr, why not use the version hosted there? Yes, I can get to it from iNaturalist as-is, but it's a lot of clicks and page loads for each image. And if I'm going through observations from other users, it's a lot of clicks just to tell whether there are larger versions. Or maybe make access to full-resolution images a paid feature. I'm sure some people would be irritated, but if the cost of server space and bandwidth is the limiting factor, get people to pay for it. And since no one can download full-resolution images now, who would be harmed? With regard to download times for people with slower internet connections, a user setting along the lines of "display images at a maximum of … x … pixels" would probably be a good solution. Making data access faster by deleting a lot of the data is not a good one.

Keep in mind that it is not resolution per se that we care about; it is the trade-off between image quality and the resulting file size that matters. Resolution is only one part of the image-quality choice, along with the compression algorithm and the particular settings used. In general I would say it is better to increase the JPEG compression than to decrease the resolution, but ideally this should be on a sliding scale, such that the higher the resolution, the more compression is applied (i.e. the lower the JPEG quality setting). For smaller cropped images, where every pixel may count, you want less compression applied. Apart from a sliding-scale compression setting, other things they could look at are whether they're already using a modern encoder like MozJPEG, and fine-tuning its parameters to best suit wildlife images. I would say iNaturalist does a better job of getting the balance right than many other web services do, though there is always room for improvement. Beyond that, the next big gain could come from implementing a more modern format. JPEG XL looks like the best option here, since existing JPEGs can be converted to it and back again without any quality loss. It is already supported by most desktop browsers (but needs to be enabled), though not so much by mobile browsers yet, I believe, so perhaps it is one to consider for the future (or perhaps implemented only for the largest "original" size image).

---

I use a ton of very CPU-intensive plugins when mixing. I'm looking for a way to get a larger buffer size than 2048 samples (47 ms) so I can listen to my playback without underruns. Currently 2048 samples is the maximum that ASIO (which I am currently using) will allow. Are there any ASIO-like virtual devices that allow 4096 samples, for example? I'm investing in an interface and a new, faster CPU in a few months, which will solve my issue, but in the interim I'm looking for a fix to hold me over. Thanks for the help.

What kind of sound card do you have now? USB? What hard disk are you using? An SSD? Nine times out of ten, buffer issues come from slow hard disks and USB interfaces; a good PCI audio card with an SSD drive can't be overrun. I'm doing projects with hundreds of tracks, and my buffer size never goes above 64 unless I use a lot of plugins, and even then it is never higher than 128. I haven't set a buffer size to 4096 in the 10 years since I had a 6600 quad core.
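The sliding-scale idea from the image-quality discussion above (more compression at higher resolutions, less for small crops) can be sketched as a simple interpolation from an image's long edge to a JPEG quality setting. This is only an illustration: the function name, edge thresholds, and quality endpoints here are assumptions for the sketch, not iNaturalist's actual values.

```python
def jpeg_quality_for(long_edge_px: int,
                     small_edge: int = 512, large_edge: int = 2048,
                     high_quality: int = 92, low_quality: int = 75) -> int:
    """Pick a JPEG quality setting on a sliding scale.

    Small (e.g. cropped) images keep a high quality setting, since every
    pixel may count; large images get compressed harder.  All thresholds
    and endpoints are illustrative, not real iNaturalist settings.
    """
    if long_edge_px <= small_edge:
        return high_quality
    if long_edge_px >= large_edge:
        return low_quality
    # Linear interpolation between the two endpoints.
    t = (long_edge_px - small_edge) / (large_edge - small_edge)
    return round(high_quality + t * (low_quality - high_quality))
```

The result could then be handed to whatever encoder is in use, e.g. Pillow's `Image.save(path, quality=q)` or MozJPEG's `cjpeg -quality`.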
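The "2048 (47 ms)" figure in the buffer-size thread follows directly from buffer length divided by sample rate. A minimal sketch, assuming a 44.1 kHz session (the thread does not state the sample rate):

```python
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """Latency contributed by one audio buffer, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

# At 44.1 kHz a 2048-sample buffer adds roughly 46 ms, close to the
# thread's quoted 47 ms; doubling to 4096 samples would give about 93 ms.
```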