• 0 Posts
  • 79 Comments
Joined 1 year ago
Cake day: June 19th, 2023


  • I think Apple made a serious miscalculation there. If they’re being honest and removed web apps because they’re technically difficult to implement, they should have said something along the lines of “we are working on this and have disabled it temporarily to avoid penalties”.

    But I suspect it’s got nothing to do with that. Web apps can run native code with WASM, and it would only be a matter of time before someone (Google?) released a “browser” that lets you run native Android apps. Or worse, native iPhone apps… bypassing Apple’s Core Technology Fee since it’s “just a webpage”.


  • That way all it will take to infect my parents’ phones with malware capable of scraping copious amounts of my data will be normal phone usage that Apple can’t protect against!

    Um… What? iPhone apps run in a sandbox. They can’t access anything. They can’t even run at all unless the user launches the app or interacts with a notification. Background running is strictly limited to things like music playback with very few exceptions (exceptions which are taken away if the user never launches the app).

    And for the record, I don’t own an android phone and never have.



  • It’s not a civil matter. The Digital Markets Act requires the EU to monitor competition and that means they will rely on input from other competitors in the same space. The DMA gives the EU powers that no private company has including the ability to issue specific directives to individual companies.

    Those directives could range from “remove this clause from your app store contract” to “stop selling phones in Europe”. No civil court case is ever going to have an outcome like that.


  • The NSA is apparently collecting and storing thirty petabytes of encrypted data per day, and that likely includes every iMessage sent worldwide. When quantum computers arrive, they will be able to decrypt that data, and some estimates put that future uncomfortably close. Real breakthroughs in quantum computing have been made in the last year or two.

    It’s good to see something is finally being done about that threat. I wouldn’t count on the NSA being the only people with the data either - it’s a goldmine and surely other governments are trying to gain access.



  • They tried a new product that they have never used before, they decided they didn’t like it enough to pay thousands of dollars, so they returned it. Sounds like a perfectly reasonable thing to do if you ask me. How are they morons?

    Personally I’m typing this comment on an ordinary LCD display with a far higher “pixels per degree” than Apple Vision Pro. It’s not even close: more than double the pixels per degree… Which means using the headset would be a significant step down in display quality for me.

    Sure - for 3D content Vision Pro would be vastly superior, but I almost never work with 3D content. I just want to read (and write) text. Vision Pro clearly isn’t a product for me until it has higher resolution displays and a wider field of view, and that’s perfectly fine; I can wait for that day. For other people it’s less clear whether or not it’s a good product for them, and I think those people should absolutely try it out to see if it works instead of guessing having never used one. And if they don’t like it, return it. That’s what return policies are for.
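    The “pixels per degree” comparison above is easy to sanity-check. A minimal sketch, assuming a 27" 5K desktop panel (~218 PPI) at a 24-inch viewing distance and the commonly cited ~34 PPD figure for Vision Pro (both numbers are assumptions for illustration, not the commenter’s actual setup):

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """Pixels spanned by one degree of visual angle at a given viewing distance."""
    # Width of a 1-degree arc at the eye, approximated with the tangent.
    arc_in = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * arc_in

# Assumed figures: a 27" 5K panel (~218 PPI) viewed from 24 inches,
# versus a commonly cited ~34 PPD for Apple Vision Pro.
monitor_ppd = pixels_per_degree(218, 24)   # roughly 91 PPD
vision_pro_ppd = 34
```

    With those assumed numbers the desktop panel does come out at well over double the headset’s pixels per degree.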



  • abhibeckert@lemmy.world to Apple@lemmy.world · iFixit’s Vision Pro Teardown (edited 8 months ago)

    These are probably the layers involved (three are physical; the rest are digital processing steps):

    The “generate faces” step is well into uncanny valley territory. There’s not enough light inside the headset to do passthrough video, so instead they have a very primitive avatar which is recognisably you but definitely doesn’t look like you.

    Personally I don’t think any of them contribute significantly to the weight of the headset. Probably less than 1% of the weight.


  • Sure, Masimo filed a bunch of invalid patents… but the two that held up seem pretty valid to me.

    They’re not “the same thing but on a computer!”. They are “the same thing but you don’t have to stick a needle in someone, fill a vial with their blood, send the blood off to a laboratory, and wait for them to send you the results”. Real time non-invasive chemical analysis of blood is a genuine invention and Masimo has been leading the world in that for a long time.

    I hate patents. I think the world would be a better place without them… but like it or not they exist. And Apple owns a lot of them… so I’m not going to be sympathetic in the slightest when they find themselves on the wrong end of a patent lawsuit.



  • I wouldn’t use a hub or install any software; that tends to be unreliable. (Dell does have drivers for Macs, and they do enable some extra features, but I wouldn’t install them. It’s mostly just software control of things you can easily do with the hardware buttons, and the risk/reward isn’t worth it since those drivers might not work with a future version of macOS.)

    The monitors I recommended all have USB-C input, so an ordinary* USB-C cable is all you need. They will also charge your Mac’s battery while connected, and you’ll be able to plug other things (wired headphones, ethernet, USB drives, etc) into the monitor. The 27" ones can also be daisy chained: connect the second monitor to the first, rather than directly to your Mac.

    (* an ordinary USB cable that can handle the bandwidth that is. DisplayPort needs about 32Gbps. If you want a long cable it’ll be expensive, but short high bandwidth cables are affordable)
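    The “about 32Gbps” figure roughly matches a full DisplayPort HBR3 link; the payload a given monitor actually needs is easy to estimate. A rough sketch (the 4K/60 Hz/10-bit mode is an assumed example, and blanking and link overhead are ignored):

```python
def raw_video_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 30) -> float:
    """Uncompressed pixel bandwidth in Gbit/s, ignoring blanking and link overhead."""
    return width * height * hz * bits_per_pixel / 1e9

# Assumed mode: a 27" 4K panel at 60 Hz with 10-bit color (30 bits per pixel).
uhd_60 = raw_video_gbps(3840, 2160, 60)   # roughly 15 Gbit/s
```

    That is why a cheap cable rated only for USB 2.0 data speeds simply cannot carry the picture, while a short certified high-bandwidth cable can.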

    If you want four monitors, with your specific GPU the fourth one has to be HDMI. When you have multiple options - generally HDMI is less reliable than DisplayPort, and DisplayPort is less reliable than USB-C.

    Ultimately you just need to plug it in and find out. Not all monitors are compatible with all GPUs; there’s a lengthy list of optional features in the specification, and almost no hardware supports all of them. Sometimes those optional features are required for a good experience. For example, you might need “Display Stream Compression” in order to connect an ethernet cable to the display, or worse, just to run at 60Hz. Sometimes playing DRM’d content can be an issue too; services like Netflix require an encrypted connection to the monitor.


  • For me size matters more than quantity. A lot of modern software is designed to have windows at least 24 inches wide and at that width you can’t really fit four monitors on a desk. At 100 inches (horizontal not diagonal) you need to turn or roll your chair to see some of your windows.

    So - at least one really big/wide monitor is the way to go in my opinion.

    For that one I’d go with a Dell U4021QW - that’s a 40" ultra wide that will comfortably support three or four large windows and more critically a single window with a bunch of internal sections. You really do need a curve at that size otherwise the left/right edge of the monitor is too far away.

    You can also configure it as multiple virtual monitors, if you want for example the left half to be one computer and the right half to be another computer… or if your software just works better with multiple monitors you can make it into two monitors without any bezel separating them…

    While it’s nice and wide, it’s not super tall… but I actually don’t mind that. You need somewhere on your desk to put a glass of water, your phone, etc. without it being in front of your monitor.

    To get some height I’d go with two regular aspect ratio (and flat) 27" panels in portrait mode. Personally I’d go with Dell again - specifically the U2723QE which is a reasonably priced (for the quality) monitor with very small bezels. Rotated sideways and touching your desk at the bottom, it will be about as tall as you would ever want a monitor to be.

    The middle panel will be 140ppi and the side panels 160ppi. That is not “retina” at laptop viewing distance (with the panel right next to your keyboard) but it is retina at a comfortable viewing distance for a monitor that large (at least 24in/60cm or so - personally mine is further away than that).

    Finally, I recommend buying VESA arm mounts. Multi-arm mounts usually don’t work well with a 40" panel (it might “work” but only if the panel is low to your desk) - so you’re probably going to have to buy three single arms. It will free up loads of desk space (and you’re going to be low on desk space), give you more flexibility to get the position/height/angle just right, and also be more stable than the stand that comes with any of those monitors.

    Make sure the one for your 40" panel in particular can handle that much weight. I’d go for one that is advertised as handling quite a bit more weight than what you’re putting on it. Less of an issue for the side monitors since I’d have those resting on your desk.


  • For example - Apple keyboards have an emoji button, a power button, and cryptographically secure* biometrics. All OS specific.

    (* the fingerprint scanner isn’t just OS specific, it requires matching hardware on the motherboard of your computer - the keyboard just has a dumb sensor and it’s your Mac that actually checks the fingerprint… and it happens in hardware not software. Not even firmware)




  • I don’t think the missing pin is a data pin - I think it might be a charge pin and possibly one that is only required for “fast” charging.

    AFAIK a proper Lightning cable can provide up to 30 watts, but way back in the early days of the cable the maximum was 5 watts, and I wouldn’t be surprised if there are physical differences in the wiring that separate those two speeds. There are probably other speeds in between those two as well.

    They both showed the same speed of around 1400mA at time of testing.

    … yeah that’s only 7W. Like I said, Lightning does up to 30W, though with an iPhone 8 the maximum is 12W (or 18W for an iPhone 8 Plus). So it seems my theory is correct: the missing pin is compromising charge speed.
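    The arithmetic behind that “only 7W” is just current times voltage, with the USB default of 5 V assumed (a charger can only exceed 5 V when a higher voltage is explicitly negotiated):

```python
def charge_watts(milliamps: float, volts: float = 5.0) -> float:
    """Charging power in watts; USB defaults to 5 V unless higher is negotiated."""
    return milliamps * volts / 1000

# The measured ~1400 mA at the standard 5 V works out to 7 W,
# well under the iPhone 8's 12 W fast-charge ceiling mentioned above.
measured = charge_watts(1400)
```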

    You said you compared it to an “original Apple cable”… but not all Apple cables have the same specs. The ones that charge faster are generally heavier and not as nice to use (also, more expensive).


    Lightning is loosely based on USB. (Apple was heavily involved in the USB-C cable/connector design process and was working on it at the same time as they were inventing the Lightning cable. They seem to have done a lot of things the same in both, since both had all the same requirements.)

    USB can operate with anywhere between 4 and (I think) 14 wires. A lot of those extra wires are redundant; for example, you can fight EM noise (interference from other nearby electrical devices) by sending the same data “inverted” at the same time across two wires running in parallel. If there’s no interference, that doesn’t gain you any performance at all, but if there is interference it could be the difference between a cable that works perfectly and a cable that doesn’t work at all.
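    That inverted-copy trick is differential signaling, and the noise cancellation is easy to demonstrate numerically. A toy sketch (the signal levels and noise values are made up; real links do this in analog hardware, not software):

```python
def differential_receive(positive, negative):
    """Recover the signal as half the difference between the two wires.
    Noise that hits both wires equally (common-mode noise) cancels out."""
    return [(p - n) / 2 for p, n in zip(positive, negative)]

signal = [1, -1, 1, 1, -1]       # made-up bit pattern (+1/-1 voltage levels)
noise = [3, -2, 4, 1, -5]        # made-up interference hitting both wires equally
positive = [s + n for s, n in zip(signal, noise)]     # wire 1: signal + noise
negative = [-s + n for s, n in zip(signal, noise)]    # wire 2: inverted signal + same noise
recovered = differential_receive(positive, negative)  # noise is gone
```

    Taking the difference doubles the signal and subtracts the shared interference away entirely, which is why the redundant wire pays off only on noisy links.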

    It’s pretty common for cables to have pins that aren’t being used - a USB-C cable with only 4 internal wires will have a lot more than 4 pins… so removing a pin is probably fine. However I’d question what else they also removed… some things are definitely not OK to remove from the cable and many of them can’t be seen from the outside. You’d have to disassemble the cable or do a CT scan.


    As far as I know, MFI certification involves sending sample cables to a third party company that will test the cable and make sure it’s compliant.

    Just because the cables they sent in for testing were compliant doesn’t mean the one they sold you is compliant. In a country with strong consumer protections, you’d be entitled to a refund if someone sold you a non-compliant cable that they claimed was compliant.

    Don’t mess around with this stuff. People have literally been killed by unsafe phone chargers.