

Then they’ll lobby against public WiFi. I was in China recently and (depending on the province) you need a phone number to access public WiFi so that they know who you are.
If the bottom can be seen at all that is completely unacceptable.
If a ceiling fan is on one of my boys will hide under chairs and couches and basically just skirt around from cover to cover. I think he thinks it is a bird.
I also don’t mind if they are “selling” nothing, or just a supporter icon. As long as they are transparent that that is all you are getting.
I’m struggling to see how this actually made money. Because presumably the customer is paying for the delivery (as well as the food that was never ordered). So the fraudsters would just be paying themselves in a complicated way. My best guess is one of the following:
This article really keeps getting better and better.
This is what I moved to after Gandi started becoming shit and I have nothing bad to say about them yet.
Huh?
I’ve used Vim for a decade and I would be offended if it made any noise.
You can consider yourself whatever you want for however long you want.
If you feel young and people think you are weird for saying so, that is their problem. Young is a feeling, not a number.
I’ve been using nginx forever. It works, I can do almost everything I want, even if more complex things sometimes require some contortions. I’m not sure I would pick it again if starting from scratch, but I have no problems that are worth switching for.
It would be wasteful to upload the full size image only to throw most of it away. JPEG compression is very cheap, especially at low resolutions (I assume that image search uses a pretty low-resolution source image). Doing it this way is actually what I would do for best user experience. (Not saying that they aren’t doing other malicious things, but doing the resizing on the client is actually a good idea)
In fact the top one has more crop.
The #1 item should be backups. (Well, maybe #2 so that you have something to back up, but don’t delete the source data until the backups are running.)
You need offsite backups, and ideally multiple locations.
This is not funny, it is mildly infuriating.
I’m pretty sure every microwave just splits the input into the last two digits as a number of seconds and the digits before that as minutes, then runs for 60 * minutes + seconds. So 0:99 is equivalent to 1:39 and 1:80 is equivalent to 2:20. I mean, it is a little weird that the seconds can be >59 and extra weird that you can do 6:66, but it isn’t exactly wizardry.
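That parsing rule can be sketched in a few lines (the function name and digit handling here are my own illustration, not any real microwave firmware):

```python
def microwave_seconds(digits: str) -> int:
    """Interpret keypad digits the way most microwaves seem to:
    the last two digits are seconds, anything before them is minutes."""
    seconds = int(digits[-2:])
    minutes = int(digits[:-2]) if len(digits) > 2 else 0
    return 60 * minutes + seconds

microwave_seconds("099")  # 0:99 -> 99 seconds
microwave_seconds("139")  # 1:39 -> also 99 seconds
microwave_seconds("180")  # 1:80 -> 140 seconds, same as 2:20
```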
“big asf” hurts. It should be “big AF”, “big as fuck” or I’ll even allow “big as F”, but pick a lane.
there will be scaling with all of its negative consequences on perceived quality
In theory this is true. If you had a nice high-bitrate 1080p video it may look better on a 1080 display than any quality of 1440p video would due to loss while scaling. But in almost all cases selecting higher resolutions will provide better perceived quality due to the higher bitrate, even if they aren’t integer multiples of the displayed size.
It will also be more bandwidth efficient to target the output size directly. But streaming services want to keep the number of different versions small. Often this will already be >4 resolutions and 2-3 codecs. If they wanted to also have low/medium/high for each resolution that would be a significant cost (encoding itself, storage and reduction in cache hits). So they sort of squish the resolution and quality together into one scale, so 1080p isn’t just 1080p it also serves as a general “medium” quality. If you want “high” you need to go to 1440p or 2160p even if your output is only 1080.
the reason no one posts the bitrates is because it’s not exactly interesting information for the general population.
But they post resolutions, which are arguably less interesting. The “general public” has been taught to use resolution as a proxy of quality. For TVs and other screens this is mostly true, but for video it isn’t the best metric (lossless video aside).
Bitrate is probably a better metric but even then it isn’t great. Different codecs and encoding settings can result in much better quality at the same bitrate. But I think in most cases it correlates better with quality than resolution does.
The ideal metric would probably be some sort of actual quality metric, but none of these are perfect either. Maybe we should just go back to Low/Med/High for quality descriptions.
The biggest question is: is this a fork or a threek?