Tech News Round-Up

How about that meow situation?

Having an unsecured database connected to the Internet in 2020 is like leaving your wallet in an unlocked car with the windows down parked in a busy downtown.

1 Like

I don’t even know where to put this one.

1 Like

Guess leftists really love their trains.

3 Likes

OH SHIT

Come on antitrust regulators where you at?

http://sandlab.cs.uchicago.edu/fawkes/#code

So Fawkes just came out with version 0.3. The tl;dr: you run a photo through it and it inserts noise that’s imperceptible to the human eye but devastating to machine learning algorithms trying to build a dataset of pictures of you.

Pretty cool, and an example of a pattern I’ve been seeing a lot: we’ve spent years getting ML to work on clean, cooperative datasets, and it’s going to be a long time before it can handle adversarial ones like this.
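To make the “imperceptible noise” idea concrete: Fawkes’ actual cloaking is its own technique, but the textbook version of adversarial noise is only a few lines. This is purely an illustrative FGSM-style sketch in PyTorch, not what Fawkes does under the hood; model, image, and true_label are stand-ins you’d supply yourself:

```python
# Illustrative only: classic FGSM-style adversarial noise, not Fawkes' method.
# A perturbation of size epsilon along the sign of the loss gradient is often
# enough to flip a classifier's prediction while looking unchanged to a human.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, true_label, epsilon=0.01):
    """Return a copy of `image` nudged by epsilon * sign(dLoss/dImage)."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    return (image + epsilon * image.grad.sign()).clamp(0.0, 1.0).detach()
```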

I downloaded it and ran it on an image of me from college. It took a while, a few minutes, but side by side the two versions look the same.

So the idea is that if you run every photo you put online through this for long enough, new photos of you won’t match as you.

Kinda cool.

Edit: For the sake of detail, while it runs it happily maxes out your CPU. So if you actually wanted to do this to all your photographs, I’d set up a pipeline that does it on another machine (a rough sketch of that is below). And it’d be rude of me not to give examples, so I ran it again on a photo of an actor I had lying around:
This is the original:
[original photo]

This is the one manipulated to not be recognizable to facial recognition software:
[cloaked photo]
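Here’s roughly what I mean by a pipeline, more sketch than recipe: point a small script at a folder of photos on a spare machine and let it grind away. The paths are made up, and the fawkes command-line flags and the “_cloaked” output naming are assumptions from the build I ran, so check fawkes --help against whatever version you install:

```python
# Hypothetical batch pipeline: cloak every photo in a folder on a spare machine
# so your main box doesn't get its CPU pegged for hours.
import subprocess
from pathlib import Path

SRC = Path("~/Pictures/to_cloak").expanduser()   # photos to process (made-up path)
DEST = Path("~/Pictures/cloaked").expanduser()   # where cloaked copies get collected

def cloak_folder():
    """Run the Fawkes CLI over a whole folder, then collect its outputs."""
    DEST.mkdir(parents=True, exist_ok=True)
    # The "-d" and "--mode low" flags are assumptions -- check `fawkes --help`.
    subprocess.run(["fawkes", "-d", str(SRC), "--mode", "low"], check=True)
    # Fawkes appears to write results beside the originals with a "_cloaked"
    # suffix (assumption); sweep them into the destination folder.
    for out in SRC.glob("*cloaked*"):
        out.rename(DEST / out.name)

if __name__ == "__main__":
    cloak_folder()
```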
3 Likes

https://youtu.be/blse_p2uNXo

Basically YouTube is getting rid of the ability for people to submit subtitles to videos, which is part of a bigger problem.

Video owners can still create and apply subtitles. They only got rid of open community submissions.

The Linux kernel patch submission method, then.

Looking like more of a sure thing.

https://arstechnica.com/gadgets/2020/09/nvidia-reportedly-to-acquire-arm-holdings-from-softbank-for-40-billion/

Update: It’s a done deal. This is going to change the face of computing. Get ready for the all-Nvidia PC. Intel getting sadder every day.

1 Like

TechRadar: Nvidia might finally have made video conferencing OK.

Samsung’s new SSD with an FPGA to improve drive search speeds

I was thinking the all-Nvidia computer would come, and I still think it will. But how good a chip can Microsoft make?

Here’s a better question.

Did Apple make something magical? Or did Apple just free themselves of the shackles of old architecture and capture those low-hanging gains?

1 Like

If anything has been consistent since the very beginning, Microsoft/IBM/PC has always been the land of backwards compatibility, and Apple has always been the land of killing off the old to bring in the new.

People are always talking about Google’s graveyard of things they started and then abandoned, but Google’s is a mercy killing: putting down their own unloved, unused children. Apple straight up murders shit while it’s still hot. I mean, did you see what they did to Flash and the headphone port?

Apple is happy to make sure an individual Apple product has a very long service life. It will continue to function properly and get updates for many years. But a new product has no guarantees of being compatible with any old software, accessories, peripherals, etc. They make you get with the new.

I think the switch from the Apple line of devices to the Macs in the late '80s set the precedent. They just repeated it for OS X, x86, and now ARM. Meanwhile, on a brand new Windows 10 computer, it’s rather shocking how old a DOS/Windows program you can convince it to run without employing any extra tools like emulators or virtualization.

I think Microsoft’s strategy provided a lot of strength in an era where people had their data in locally stored files, and those files had proprietary formats. You absolutely needed to run a particular program, and maybe even a specific version of that program, or your data was effectively all lost. Any upgrade needed to keep that program working perfectly.

It was also an era where people paid for software, and thus they didn’t want that software to stop working. Not only that, but the software they bought was a specific version for a specific platform. Even if the program was actively maintained, and new versions got released, they might cost money. Rebuying software sucks.

But now we are in a cloud world. People don’t even really have files. If they do, they are probably in standard formats supported by all apps. Most software is free for all platforms. If you paid for software, you can just freely download a different or upgraded version if necessary.

And almost all software is just a user interface for an Internet service. As long as that service exists, they provide clients for all platforms. If they die, the software becomes useless anyway, so why would you want to run it?

When all your shit is on the Internet on someone else’s computer, you can just change your client side device to whatever you want. The only people that can’t are a small set of pros and specialists that need or want to run old software or do very particular things. In such a world, people can just easily move to whatever the best available platform and hardware is, and the lack of backwards compatibility is no big deal.

You just over-simplified, over-generalized, and misrepresented an impressive amount. But I mostly agree with the general thrust of your point. 30 years from now nerds will fawn over their Intel x86-based systems they have maintained or restored, and most people who want to run old software will do it via virtualization.

It’s interesting: the death of Moore’s Law has been predicted/anticipated for decades, and all I ever heard was “something will have to be figured out when that happens,” but I never heard anyone bet that Intel would be passed by an upstart chip architecture while completely failing to even attempt an alternative. Have they spent these past few years wrongly focused on developing a GPU to compete with Nvidia and AMD (ATI) while being oblivious to this threat? Talk about fighting the last war.

Intel’s woes are largely due to dysfunction within the ranks and leadership of the company. If you search around you can find quite a few testimonials from former employees talking about how it has been a shit show for quite a few years. Here’s just one.

The company has lost focus. Lost talent in layoffs and resignations. Made lots of questionable engineering and design decisions that have received criticism even from the Linus Torvaldses of the world. And they keep having repeated delays in rolling out their new manufacturing processes.

It’s a lot like a dysfunctional sports team: the only fix seems to be completely replacing everyone at the top. It’s been well known for years, and it’s been going on for longer than that. I’m surprised the shareholders haven’t taken action by now.

https://www.phoronix.com/scan.php?page=news_item&px=Intel-CEO-Pat-Gelsinger

They’re trying to turn the ship around.