They sold a bad product that needs to be fixed; bad software shouldn’t get an exception. The warning icons were probably not compliant and should never have left the factory.
It’s worse than that: people will argue shipping good code is impossible. Good testing is hard, so it gets skipped in favour of things like unit tests, which are only equivalent to basic QA in manufacturing. Every software change is a design change, and the whole system needs to be fully validated and tested. That means driving the car, not shipping the code and using the users’ cars to prove your design.
Broken software shouldn’t be accepted as much as it is, especially in safety-critical systems like cars, and especially when manufacturers remove manual controls for things like steering, brakes, handbrakes and door handles. Fly/drive-by-wire is more dangerous when the software is unreliable. Mechanical linkages either fail immediately or degrade over a long time; bad software fails in uncertain and potentially chaotic ways.
This was caused by lowest bidder decision making. Along with a tolerance for critical systems designed, developed and manufactured outside of North America and Western Europe. If a country doesn’t have a history of liberal democracy, they can never be fully trusted.
Because there are so many small parts to a processor, you need 99.99+% yield at most stages to stand any chance of mass production. In this context 60-70% is seriously impressive; millions of things have to be done right to get that kind of yield.
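As a rough back-of-the-envelope sketch of why (the stage counts and per-stage yields below are made-up illustrative numbers, not real fab data), per-stage yield compounds multiplicatively across the whole process:

```python
# Sketch: how per-stage yield compounds across a chip's process steps.
# Stage counts and yields are illustrative assumptions, not real fab data.

def overall_yield(per_stage_yield: float, stages: int) -> float:
    """Probability that a die survives every stage."""
    return per_stage_yield ** stages

for per_stage in (0.9999, 0.99995):
    for stages in (1_000, 5_000, 10_000):
        print(f"{per_stage:.3%} per stage over {stages:>6} stages -> "
              f"{overall_yield(per_stage, stages):.1%} overall")
```

With these made-up numbers, 99.99% per stage over 5,000 stages already lands around 60%, which is roughly the range being praised here.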
In this situation a hub is still better. You can pack all the stuff away plugged into the hub for easier setup. If you’re plugging all of that into your laptop, you’ll need to plug it all back in again when you move.
Pop!_OS is the same machine as the Ubuntu one, but with RGB.
It was very likely a designer’s decision. It forces the use case they wanted: wireless mice should be used wirelessly. I would bet they fought marketing and management to get this onto the final product.
Marketing would want a mouse they can advertise as being usable both wired and wireless. Female ports are easier to mount and manufacture when there is depth to set the socket into, so a plug on the front is much cheaper and easier to manufacture.
The fact that the charging cable doesn’t get used while the mouse is in motion means it will last longer, and you won’t have people using fraying cables on the front of their mouse.
People sitting on unpatched exploits are waiting for that end of support. It increases the value of their unreleased exploit.
It was 12 years ago that he said he would put a man on Mars within 10 years.
I think it is LED technology. LEDs have a very narrow bandwidth. Even white LEDs are just three very narrow-bandwidth emissions.
The very high intensity packed into such a narrow bandwidth is hard on the eyes, even compared with older lighting technology at the same power, which has a comparatively massive bandwidth.
LEDs could be designed to compensate for this better. They could add more colours of LED to the matrix that makes up a white LED.
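To make the “same power, narrower bands” point concrete, here is a toy sketch (the peak positions, widths and the broadband comparison source are purely illustrative assumptions, not measured spectra) comparing how concentrated the power is in a three-peak white LED model versus an equally powerful broad source:

```python
import numpy as np

# Toy model: spectra as sums of Gaussian peaks across the visible range (nm).
# Peak positions, widths and the "broad source" are illustrative assumptions,
# not measured LED data.
wavelengths = np.linspace(380, 780, 2000)
step = wavelengths[1] - wavelengths[0]

def gaussian(centre_nm, width_nm):
    return np.exp(-0.5 * ((wavelengths - centre_nm) / width_nm) ** 2)

def normalise(spectrum):
    """Scale so the total radiated power (area under the curve) equals 1."""
    return spectrum / (spectrum.sum() * step)

# RGB-style white LED: three narrow peaks.
led = normalise(sum(gaussian(c, 12) for c in (450, 530, 630)))
# Older broadband source: one wide hump spanning the visible range.
broad = normalise(gaussian(580, 120))

# Same total power, but the LED piles far more of it into its peaks.
print(f"peak spectral intensity, narrow-band LED: {led.max():.4f} per nm")
print(f"peak spectral intensity, broad source:    {broad.max():.4f} per nm")
print(f"concentration ratio: {led.max() / broad.max():.1f}x")
```

Adding more, differently coloured peaks to the model spreads the same power across more of the spectrum and pulls that ratio back down.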
You know it’s a Thunderbolt connection on a MacBook. They stopped using the USB symbol when they moved Thunderbolt onto the USB-C connector and stopped using Mini DisplayPort.
This is going to be used for tracking customers’ locations in supermarkets.
A non-deterministic system is dangerous. A deterministic system with flaws can be better: the flaws can be identified, understood and corrected, and they are more likely to show up in testing.
Machine learning is nearly always going to be non-deterministic. If they then use continuous training, the situation only gets worse.
If you use machine learning because you can’t understand how to solve the problem, then you’ll never understand how the system works. You’ll never be able to pass a basic inspection test.
When you automate these processes you lose the experience. I wouldn’t be surprised if you couldn’t parse information as well as you can now, if you’d had access to ChatGPT.
It’s hard to get better at solving your problems if something else does it for you.
Also, the reliability of these systems is poor, and they’re specifically trained to produce output that appears correct, not output that actually is correct.
You need software support to use them. Supporting features like this is already common, but it does take time to develop, test and deploy that software.
The software will exist in kernels, drivers and libraries. Intel already supports things like this.
You may need to wait, or use a bleeding-edge version of your OS, to get support for these extra features.
Yeah. I think they will struggle to match Apple. By the time they do, Apple will have progressed further.
Another big issue is that these features need deep, well-implemented software support. This is really easy for Apple: they control all the hardware and software, write all the drivers, and can modify their kernel to their heart’s content. A better processor is still unlikely to match Apple’s overall performance. Intel has to support more operating systems and interface with more hardware over which it has little control. It won’t be until years after release that these processors realistically reach their potential, by which time Intel and Apple will both have released newer chips with more features that Intel users won’t be able to use for a while.
This strategy has Intel on the back foot, and they will remain there indefinitely. They really need a bolder strategy if they want to reclaim the best desktop processors. It’s pretty embarrassing that an Apple laptop with an integrated GPU completely wipes the floor with Intel desktop CPUs with dedicated GPUs in certain workflows; it can often be cheaper to buy the Apple device if you’re in a creative profession.
Qualcomm will have similar issues, but they won’t be limited by the inferior x86 architecture. x86 only serves backwards compatibility and Intel/AMD. ARM is used on phones because, under the same fab and power restrictions, it makes better processors. This has been known for a long time, but consumers wouldn’t accept it until Apple proved it.
I wouldn’t be surprised if these Intel chips flop initially and Intel cuts its losses and stops developing new ones. Then we will see lots of articles saying Intel should never have stopped developing them, that they’re really competitive relative to their contemporaries, not realising the software took that much time to utilise them effectively.
Extra components mean more specialised hardware for each task. This more specialised hardware can often process the same data faster and with less power consumption. The drawbacks are cost, complexity, and that these components are only good for that one task.
CPUs are great because they are multipurpose and can do anything, given infinite time and storage. This flexibility means they aren’t as optimised.
Most people are not writing custom code to solve their own problems. They are running very common applications that use very common libraries for similar functions. So for the general user, dedicated hardware for encryption, video codecs, networking, etc. will reduce power consumption and increase processing speed in a practical way.
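If you want to see one of those accelerators at work on your own machine, here is a rough sketch (it assumes the third-party `cryptography` package, which is backed by OpenSSL and will use the CPU’s dedicated AES instructions when they’re available) comparing AES-GCM against the more software-oriented ChaCha20-Poly1305:

```python
# Rough benchmark sketch: a cipher with dedicated CPU instructions (AES)
# versus one designed to run well in plain software (ChaCha20).
# Assumes the third-party `cryptography` package (OpenSSL-backed) is installed.
import os
import time

from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

payload = os.urandom(64 * 1024 * 1024)  # 64 MiB of random data
nonce = os.urandom(12)

def throughput_mb_s(encrypt) -> float:
    start = time.perf_counter()
    encrypt(nonce, payload, None)
    return len(payload) / (time.perf_counter() - start) / 1e6

aes = AESGCM(AESGCM.generate_key(bit_length=256))
chacha = ChaCha20Poly1305(ChaCha20Poly1305.generate_key())

print(f"AES-GCM:           {throughput_mb_s(aes.encrypt):7.0f} MB/s")
print(f"ChaCha20-Poly1305: {throughput_mb_s(chacha.encrypt):7.0f} MB/s")
```

On a CPU with AES-NI the AES numbers are usually well ahead; on one without it the ordering tends to flip, which is the same point in miniature: dedicated hardware for a common job beats doing it in general-purpose code.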
I don’t think he’s suggesting it isn’t open source, just that we need more open source engines.