He’s very good.

  • 1 Post
  • 82 Comments
Joined 1 year ago
Cake day: June 20th, 2023

  • Video encoding has several tradeoffs:

    • Bitrate
    • Resolution/frame rate
    • Perceived quality
    • Computational complexity of encoding
    • Computational complexity of decoding

    The video-encoding chips in cell phones make sacrifices to preserve encoding speed and battery life (higher computational complexity costs more processing cycles and tends to use more power). So it’s simpler encoding, in exchange for less efficient compression at a given quality; the rough sketch at the end of this comment shows what that speed/size tradeoff looks like in practice.

    YouTube (and all the social media sites) has huge server farms with highly specialized encoding chips that re-encode videos to squeeze more quality out of every bit. That makes sense, because a video that will be watched millions of times benefits from even a very slight improvement in bitrate in exchange for a one-time cost of complex encoding. It’s also why YouTube tends not to convert a video to AV1 (very efficient in bitrate for a given quality, but computationally expensive to encode) until it has a few hundred views: it’s not clear the tradeoff is worth it until they know a lot of people will be watching it.

    Netflix customizes even further on a per-video basis, and looks for even more specialized tricks scene by scene, because every one of its videos only needs to be encoded once per quality/format but will be watched millions of times.

    In other words, it’s like any other engineering problem. The engineers choose different tradeoffs based on context, which means that the cell phone applies a different set of tradeoffs compared to the social media site’s server farm.
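
    If you want to see that tradeoff for yourself, here’s a minimal sketch (not anything a phone or YouTube actually runs; it just assumes ffmpeg with libx264 is installed and that input.mp4 is a test clip you supply). Slower presets spend more CPU time searching for better compression at the same quality target, which is the same lever the server farms pull at much larger scale:

    ```python
    # Rough illustration of the speed-vs-compression tradeoff: same codec,
    # same quality target (CRF), different amounts of encoder effort.
    # Assumes ffmpeg (with libx264) is on PATH and "input.mp4" exists.
    import os
    import subprocess
    import time

    INPUT = "input.mp4"  # placeholder test clip

    for preset in ("ultrafast", "medium", "veryslow"):
        out = f"out_{preset}.mp4"
        start = time.perf_counter()
        subprocess.run(
            ["ffmpeg", "-y", "-i", INPUT,
             "-c:v", "libx264", "-preset", preset, "-crf", "23",
             out],
            check=True,
            capture_output=True,  # keep ffmpeg's console output quiet
        )
        elapsed = time.perf_counter() - start
        size_mb = os.path.getsize(out) / 1e6
        print(f"{preset:>9}: {elapsed:6.1f}s to encode, {size_mb:6.1f} MB output")
    ```

    Typically the slower presets take several times longer and produce a noticeably smaller file at the same CRF, which is exactly the “one-time encoding cost vs. bitrate savings” calculation described above.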


  • This comment basically demonstrates the weakness of these AI-driven summarizers in their current state. It doesn’t explain who is who or why each fact offered is relevant to the larger story. A good summary strips out the details but preserves the high-level information, adding context as necessary. This generated summary kind of does the opposite, going down a largely irrelevant rabbit hole about how he was caught and who he was affiliated with.

    The real, actual TL;DR:

    Cameron Ortis, former intelligence chief of the Royal Canadian Mounted Police, has been convicted of leaking state secrets to three foreigners and attempting to leak state secrets to a fourth.

    Ortis did not deny leaking the secrets but raised a defense that the leaks were part of a legitimate intelligence operation, and that he was leaking the secrets to entice foreign subjects into using communications platforms monitored by Canadian intelligence and its “Five Eyes” partners (intelligence agencies of the US, UK, Australia, New Zealand). The operators of those platforms deny working with western intelligence.




  • I would choose a larger screen over that marginal difference in dpi every day of the week.

    Yes, but you’re not addressing my point that the price for the hardware isn’t actually bad, and that people who complain would often just prefer to buy hardware with lower specs for a lower price.

    The simple fact is that if you tried to build a MacBook killer and compete on Apple’s own turf by matching specs, you’d find that the entry-level Apple devices are basically the same price as other laptops configured with similar specs, because Apple’s baseline already includes a pretty powerful CPU/GPU and a high-resolution display. So the appropriate response isn’t that they overcharge for what they give, but that they make choices that are more expensive for the consumer, which is the subtle difference I’ve been trying to explain throughout this thread.

    You cannot compare an app that runs on two different OS.

    Why not? Half of the software I use is available on both Linux and MacOS, and frankly a substantial amount of what most people do is in the browser anyway. If the software runs better on one device than another, that’s a real-world difference that can be measured. If you’d prefer to use Passmark or whatever other benchmark you like, you’ll still be able to compare specific CPUs.


  • This is a £1400 laptop from scan V’s £1500 macbook air currently.

    Ah, I see where some of the disconnect is. I’m comparing U.S. prices, where identical Apple hardware is significantly cheaper (that 15" Macbook Air starts at $1300 in the U.S., or £1058).

    And I can’t help but notice you’ve chosen a laptop with a worse screen (a larger panel with lower resolution). Like I said, once you actually start looking at high-DPI screens on laptops, you’ll find that Apple’s prices are pretty cheap. 15-inch laptops with at least 2600 pixels of horizontal resolution generally start at higher prices. It’s fair to say you don’t need that kind of screen resolution, but the price for a device with those specs is going to be higher.

    That laptop’s CPU also benchmarks slightly behind the 15" Macbook Air’s, even though the Air is held back by having no fans to manage thermals.

    There’s a huge market for new computers with lower prices and lower performance than Apple’s cheapest models. That doesn’t mean Apple’s cheapest models are a bad price for what they are; Dell and Lenovo have plenty of models roughly in Apple’s price range, unless and until you start adding memory and storage. Thus, the reverse-engineered pricing formula is a pretty low price for the CPU/GPU and a very high price for the storage/memory.

    All of the PC components can be upgraded at the cost of the part + labour.

    Well, that’s becoming less common. Lots of motherboards are now relying on soldered RAM, and a few have started relying on soldered SSDs, too.


  • Except the boot process on a non apple PC is open software.

    For the most part, it isn’t. The typical laptop you buy from the major manufacturers (Lenovo, HP, Dell) has closed-source firmware. They all end up supporting the open UEFI standard, but the implementation is usually closed source. Flashing replacement firmware that is mostly open source with closed-source binary blobs (like coreboot), or fully open source (like libreboot), gets open code closer to the hardware at startup, but it still sits on proprietary implementations.

    There’s some movement to open source more and more of this process, but it’s not quite there yet. AMD has the OpenSIL project and has publicly committed to open sourcing functional firmware for its chips by 2026.

    Asahi uses the open source m1n1 bootloader to load U-Boot, which in turn loads desktop Linux bootloaders like GRUB (which generally expect UEFI compatibility), as described here:

    • The SecureROM inside the M1 SoC starts up on cold boot, and loads iBoot1 from NOR flash
    • iBoot1 reads the boot configuration in the internal SSD, validates the system boot policy, and chooses an “OS” to boot – for us, Asahi Linux / m1n1 will look like an OS partition to iBoot1.
    • iBoot2, which is the “OS loader” and needs to reside in the OS partition being booted to, loads firmware for internal devices, sets up the Apple Device Tree, and boots a Mach-O kernel (or in our case, m1n1).
    • m1n1 parses the ADT, sets up more devices and makes things Linux-like, sets up an FDT (Flattened Device Tree, the binary devicetree format), then boots U-Boot.
    • U-Boot, which will have drivers for the internal SSD, reads its configuration and the next stage, and provides UEFI services – including forwarding the devicetree from m1n1.
    • GRUB, booting as a standard UEFI application from a disk partition, works like GRUB on any PC. This is what allows distributions to manage kernels the way we are used to, with grub-mkconfig and /etc/default/grub and friends.
    • Finally, the Linux kernel is booted, with the devicetree that was passed all the way from m1n1 providing it with the information it needs to work.

    If you compare the role of iBoot (proprietary Apple code) to the closed source firmware in the typical Dell/HP/Acer/Asus/Lenovo booting Linux, you’ll see that it’s basically just line drawing at a slightly later stage, where closed-source code hands off to open-source code. No matter how you slice it, it’s not virtualization, unless you want to take the position that most laptops can only run virtualized OSes.

    I think you mean that Apple uses its own memory more effectively then a windows PC does.

    No, I mean that when you spec out a base model Macbook Air at $1,199 and compare it to similarly specced Windows laptops (CPUs/GPUs that deliver comparable performance on benchmarks, and a similar-quality display built into the laptop), the Macbook Air is usually cheaper. The Windows laptops tend to become cheaper once you’re comparing Apple to non-Apple at higher memory and storage (roughly 16GB/1TB), but the base model Macbooks do compare favorably on price.


  • Can you run that outside of a virtual box?

    It’s not virtualization. It’s actually booted and runs on bare metal, same as the way Windows runs on a normal Windows computer: a proprietary closed UEFI firmware handles the boot process but boots an OS from the “hard drive” portion of non-volatile storage (usually an SSD on Windows machines). Whether you run Linux or Windows, that boot process starts the same.

    Asahi Linux is configured so that Apple’s firmware loads a Linux bootloader instead of booting MacOS.
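
    If you want to convince yourself that a setup like this is bare metal rather than a VM, here’s a minimal sketch for any systemd-based Linux install (nothing in it is Asahi-specific; the devicetree path only exists on devicetree-based machines, and the UEFI check just reflects whether the kernel was started via UEFI services):

    ```python
    # Quick bare-metal sanity check using standard Linux interfaces.
    # Assumes a systemd-based distro (for systemd-detect-virt).
    import pathlib
    import subprocess

    # Prints the hypervisor name if virtualized, or "none" on bare metal.
    virt = subprocess.run(
        ["systemd-detect-virt"], capture_output=True, text=True
    ).stdout.strip()
    print(f"virtualization: {virt or 'unknown'}")

    # On devicetree-based systems, firmware passes a model string to the kernel.
    model = pathlib.Path("/sys/firmware/devicetree/base/model")
    if model.exists():
        print("devicetree model:", model.read_text().rstrip("\x00"))

    # This directory exists when the kernel was booted with UEFI services.
    print("booted via UEFI services:", pathlib.Path("/sys/firmware/efi").exists())
    ```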

    And wouldn’t it be a lot cheaper to just build your own PC rather than pay the premium for the apple logo?

    Apple’s base configurations are generally cheaper than similarly specced competitors, because their CPUs/GPUs are so much cheaper than comparable Intel/AMD/Nvidia chips. The expense comes from exorbitant prices for additional memory or storage, and from the fact that they simply refuse to use cheaper display tech even in their cheapest laptops. The entry-level laptop has a 13-inch 2560x1600 screen, which compares favorably to the highest-end displays available on Thinkpads and Dells.

    If you’re already going to buy a laptop with a high quality HiDPI display, and are looking for high performance from your CPU/GPU, it takes a decent amount of storage/memory for a Macbook to overtake a similarly specced competitor in price.



  • As long as no one is getting hurt I don’t really see the problem.

    It’d be hard to actually meet that premise, though. People are getting hurt.

    Child abuse imagery is used as a kind of currency within those circles to incentivize additional distribution, which means there is demand for ongoing, new abuse of actual victims. Extending that economic analogy, seeding that economy with liquidity might or might not incentivize the creation of new authentic child abuse imagery (which requires a child victim to create). That part isn’t as clear, but what is clear is that it would reduce the transaction costs of distributing existing child abuse imagery, which is a form of re-victimizing those who have already been abused.

    Child abuse imagery is also used as a grooming technique. Normalization of child sexual activity is how a lot of abusers persuade children to engage in sexual acts. Providing victimless “seed” material might still result in actual abuse happening down the line.

    If the creation of AI-generated child abuse imagery begins to give actual abusers and users of real child abuse imagery cover, to where it becomes more difficult to investigate the crime or secure convictions against child rapists, then the proliferation of this technology would make it easier to victimize additional children without consequences.

    I’m not sure what the latest research says on the extent to which viewing and consuming child porn leads to harmful behavior down the line (on the one hand, maybe it’s a less harmful outlet for unhealthy urges; on the other hand, it may feed an addictive cycle that results in net additional harm to society).

    I’m sure there are a lot of other considerations and social forces at play, too.