• 0 Posts
  • 10 Comments
Joined 3 years ago
Cake day: July 1st, 2023

  • Caught me. Was just an easy number to pull.

    But I’d argue that 2% is still something to look at. A 2% shortfall in power capacity still means you are looking at rolling blackouts to handle the demand/production mismatch. If power has to be rationed, then I’d much rather have an extra ~50k AC units running than pretty lights for advertisements. Especially since load tends to peak during the day anyway, shutting off the lights during the day makes sense.
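    A quick back-of-the-envelope check on the ~50k figure (the ~3kW-per-unit draw and the 150MW of lighting are my assumptions for illustration, not from the comment):

```python
# Rough sanity check: how many AC units could freed lighting capacity run?
# Assumptions (mine): ~150 MW of freed capacity, ~3 kW draw per central AC unit.
freed_capacity_w = 150e6      # hypothetical 150 MW of freed lighting load
ac_draw_w = 3e3               # assumed ~3 kW per central AC unit
units = freed_capacity_w / ac_draw_w
print(f"{units:,.0f} AC units")  # → 50,000 AC units
```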


  • cogman@lemmy.world to Socialism@lemmy.ml · Know the real culprits · 11 hours ago

    It’s actually a bit silly to call lighting a “base load”. That’s not how the grid works. Base load is specifically talking about the grid itself and what the lowest load is on the grid. They don’t have an actuarial table where your refrigerator gets put into the base load bucket while your bathroom lights are put in the peak load bucket. It’s all one load.

    What power companies are looking at is the demand curve. The lowest level of the demand curve is the base load. That’s all it is.
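    In other words, the base load is literally just the minimum of the demand curve. A toy sketch with made-up hourly numbers:

```python
# Base load is the minimum of the demand curve; peak load is the maximum.
# Hypothetical hourly demand (MW) for one day -- numbers are illustrative only.
hourly_demand_mw = [620, 600, 590, 585, 600, 650, 720, 800,
                    850, 880, 900, 920, 930, 940, 950, 960,
                    970, 950, 900, 850, 800, 750, 700, 650]
base_load = min(hourly_demand_mw)   # lowest point of the curve
peak_load = max(hourly_demand_mw)   # highest point of the curve
print(base_load, peak_load)  # → 585 970
```

    There's no per-appliance bucketing anywhere in that calculation; it's one aggregate curve.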

    Things do get trickier with commercial power, especially when talking about machinery. But for something as simple as lighting, it’s completely straightforward. Turning off 150MW of lights frees 150MW of peaker capacity, which can be used for more useful things like boiling water in a data center to answer questions wrong (I kid).


  • cogman@lemmy.world to Socialism@lemmy.ml · Know the real culprits · edited · 9 hours ago

    Compressor startup is more intensive than lighting. Once the compressor is running it’s a pretty steady power consumption.

    A window unit, for example, on startup (assuming it doesn’t have a smooth start) will pull a full 20A. However, during operation it ultimately will pull around 5A.
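    Under an assumed US mains voltage of 120V (the amp figures are from the comment, the voltage is mine), those currents work out to:

```python
# Startup (inrush) vs. steady-state draw for a window AC unit.
# 20 A and 5 A are the figures above; 120 V is an assumed US circuit voltage.
volts = 120
startup_amps = 20
running_amps = 5
startup_w = volts * startup_amps   # power drawn at compressor startup
running_w = volts * running_amps   # power drawn once running
print(startup_w, running_w)  # → 2400 600
```

    So a roughly 4x spike at startup, which is why inrush matters more than steady-state lighting-style loads.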

    That said, there’s not some sort of special electrical budget which makes the lights in NYC come from baseload generators vs peakers. If those lights turned off, the total grid load would go down by the amount of power those lights consume. And, as it turns out, those lights are consuming around 150MW. That’s ~4 steel mills’ worth of heat just being shoved into the atmosphere for advertisement. It’s at least 1 power plant’s worth of power.

    Shutting those lights off would take the coordination of something like 10 businesses, vs telling the millions of residents of NY to adjust their power consumption. It absolutely would make a difference. It’s not like there isn’t still a base load of power needed with those lights off.

    Edit: My numbers are off, it’s closer to 35MW. ~1 steel mill’s worth.


    Hey, can we stop calling everything with a computer “AI”? Order management systems have been a thing since long before LLMs were invented (I’ve worked on one). This was perhaps one of the first applications of computing. Humans hand-writing order forms in a major grocery store hasn’t been a thing since like the 80s.

    Also, I’m like 80% sure this article was barfed out by an LLM. The em-dashes be everywhere.





  • > I promise streaming services and CDNs employ world-class experts in encoding

    > They don’t really care about the quality

    It’s funny that you are trying to make both these points at the same time.

    You don’t hire world-class experts if you don’t care about quality.

    I have a hobby of re-encoding Blu-rays to lower bitrates. And one thing that’s pretty obvious is that the world-class experts who wrote the encoders in the first place have them overly tuned to omit data from dark areas of a scene to avoid wasting bits there. This is true of H265, VP9, and AV1. You have to specifically tune those encoders to spend more of their bits on the dark areas, or you have to up the bitrate to absurd levels.
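    A toy sketch of the effect being described: real encoders use far more sophisticated adaptive quantization (e.g. x265's `--aq-mode`), but the net result is that darker blocks get a coarser quantizer and therefore fewer bits. The threshold, penalty, and function here are entirely made up for illustration:

```python
# Hypothetical model of luma-adaptive quantization: darker blocks get a
# higher QP (coarser quantization), so the encoder spends fewer bits on them.
def block_qp(mean_luma, base_qp=28, dark_penalty=8):
    """Return an illustrative QP for a block given its average 8-bit luma."""
    if mean_luma < 32:                 # very dark block (made-up threshold)
        return base_qp + dark_penalty  # crush detail to save bits
    return base_qp

print(block_qp(16))   # dark block: quantized coarsely → 36
print(block_qp(128))  # mid-tone block: normal QP → 28
```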

    Where these encoders spend the bitrate in dark scenes is on any areas of light within the scene. That works great if you are looking at something like a tree with a lot of dark patches, but it really messes with a single lit subject surrounded by darkness. It just so happens that it’s really easy to dump 2Mbps on a torch in a hall and leave just 0.1Mbps for the rest of the scene.

    > That will unarguably provide a drastically worse experience on a high-enough quality TV than a 40Mbps+ Blu-ray. Like, day and night in most scenes and even more in others.

    I can tell you that this is simply false. And it’s the same pseudo-scientific logic pushed by someone trying to sell gold-plated cables and FLAC encodings.

    Look, beyond just the darkness-tuning problem that streaming services have, the other problem they have is QoS. The way content is encoded for streaming just isn’t ideal. When you say “they have to hit 14Mbps”, the fact is that they are forcing themselves to do 14Mbps throughout the entire video. The reason they do this is that they want to limit buffering as much as possible: it’s a much better experience to lower the resolution than to buffer constantly. But that choice makes it really hard to do good video optimizations in the encoder. Every second of the video they are burning 14Mb whether they need those 14Mb or not. The way to deliver less data would be to only average 14Mbps rather than forcing it throughout, allowing 40Mbps bursts when needed and pushing everything else out at 1Mbps. However, the end user doesn’t know that the reason they just started buffering is that a high-motion action scene is coming up (and Netflix doesn’t want to buffer for more than a few minutes).
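    A toy comparison of the two strategies, with a completely made-up scene mix, shows why average-constrained encoding delivers so much less data:

```python
# Constant 14 Mbps vs. a variable encode that bursts to 40 Mbps on action
# scenes and drops to 1 Mbps elsewhere. Scene proportions are illustrative.
seconds = 600                            # 10-minute clip
action_seconds = 60                      # assume 10% high-motion
cbr_total = 14 * seconds                 # Mb sent at a constant 14 Mbps
vbr_total = 40 * action_seconds + 1 * (seconds - action_seconds)
print(cbr_total, vbr_total)  # → 8400 2940
```

    The variable encode sends roughly a third of the data here, but the viewer's buffer can't predict the 40Mbps bursts, which is the QoS tradeoff described above.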

    The other point I’d make is that streaming companies simply have a pipeline that they shove all video through. And, because it’s so generalized, these sorts of tradeoffs which make stuff look like a blocky mess happen. Sometimes that blocky mess is present in the source material (The streaming services aren’t ripping the blurays themselves, they get it from the content providers who aren’t necessarily sending in raws).

    I say all this because you can absolutely get 4K and 1080p looking good at sub-Blu-ray rates. I have a library filled with these re-encodes, and they look great because of my experience here. A decent amount of HD media can be encoded at 1 or 2Mbps and look great. But you have to make tradeoffs that streaming companies won’t make.

    For the record, the way I do my encoding is a scene-by-scene encode, using VMAF to adjust the quality rate with some custom software I built to do just that. I target a VMAF score of 95, which ends up looking just fantastic across media.
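    A sketch of what that per-scene loop might look like: a binary search for the highest CRF (smallest file) that still hits the VMAF target. The encode-and-score step is mocked with a toy quality model; a real version would shell out to an encoder plus libvmaf per scene. This is my reconstruction, not the author's actual software:

```python
# Hypothetical per-scene CRF search targeting a minimum VMAF score.
def mock_vmaf(crf):
    """Stand-in for encoding a scene at `crf` and scoring it with VMAF.
    Toy model: quality falls linearly as CRF rises."""
    return max(0.0, 100.0 - 1.5 * crf)

def find_crf(target=95.0, lo=0, hi=51):
    """Binary-search the highest CRF whose (mocked) VMAF still meets target."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2
        if mock_vmaf(mid) >= target:   # good enough: try a higher (cheaper) CRF
            best, lo = mid, mid + 1
        else:                          # too lossy: back off
            hi = mid - 1
    return best

print(find_crf())  # highest CRF whose mocked VMAF >= 95
```

    With real encodes the scoring step dominates the runtime, so the log-time search matters far more than it does with the mock.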