My favorite role was Baron von Butcher.
Pussycat.
Yes, mostly sherbet, but there was also a vanilla version with a chocolate sauce swirl.
PushUps was the brand name I remember.
I think it is highly likely the ice cream manufacturer ordered the rolls from the same company that makes toilet paper rolls.
Harley Davidson and the Marlboro Man can sing.
It takes a real artist to be at the peak of popularity and still play at a Bass Pro Shop.
It’s Algebra 2. I just checked and only 6 states require it. Crazy. I was in a state that didn’t require it but had finished Calculus 2 by graduation.
Math is personalized in American schools. There’s on-grade, advanced, GT (gifted and talented), and accelerated. Each level above on-grade corresponds to how many years ahead of your grade your class’s math is. Depending on how large your school is, GT and accelerated math students will take math with the grades above them.
On-grade would be quadratics in 9th.
Perhaps this is why these features will only be available on the iPhone 15 Pro/Pro Max and newer?
I’m not guessing. I linked to the article about the M3, which is much more powerful than the A17 Pro in the 15 Pro and has the same NPU.
Nothing AI about it.
Voice processing is AI and was done on Apple’s servers. Previously, only the wake phrase “Hey Siri” was processed locally. Onboard AI chips will allow this to be local, but the actual queries will still go to the servers. Phones do not have the power to run a useful LLM locally, at least not with the near-instantaneous response times phone users expect. A 56-watt M3 Max with 128 GB of RAM does around 8.5 tokens/second.
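To put that 8.5 tokens/second in perspective, here’s a rough back-of-the-envelope sketch. Only the tokens/second figure comes from the comment above; the answer length and the phone-class rate are assumptions for illustration:

```python
# Rough latency estimate for generating a spoken answer locally.
# The 8.5 tokens/s figure is the M3 Max rate quoted above; the ~100-token
# answer length is an assumed size for a short weather-style reply.
def generation_time(answer_tokens: float, tokens_per_second: float) -> float:
    """Seconds to generate the answer, ignoring prompt-processing time."""
    return answer_tokens / tokens_per_second

print(f"{generation_time(100, 8.5):.1f} s")  # ~11.8 s on the M3 Max rate above
print(f"{generation_time(100, 2.0):.1f} s")  # ~50 s at an assumed phone-class rate
```

Even at the desktop-class rate the wait is over ten seconds, which is nowhere near the instant feel people expect from Siri.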
Google being the search engine means Google gets that valuable search data, so they pay to be the default search engine to get your data.
Because Apple’s lawyers will go ham.
Google pays Apple $20 billion a year to keep their search engine the default on Apple devices. The subtext of “search” is that Google pays Apple for your search data.
Apple has sold your data for the right price to Google, so there should be no expectation that they won’t do the same with other companies.
Which is exactly what I said. It’s not local.
That they are keeping the data you send private is irrelevant to the OP’s claim that the AI model answering questions is local.
Most requests are handled on-device.
Literally impossible.
“Hey Siri, what’s the weather forecast for tomorrow?”
< The Farmer’s Almanac that is in my local model says it will rain tomorrow. >
Well, most of the requests are handled on-device.
Doubt.
Voice recognition, image recognition, yes. But actual questions will go to Apple servers.
Used servers/workstations are likely more reliable than new consumer hardware.
They were very likely kept temperature-controlled, have ECC, and are actually known to be working, unlike something like a new Asus consumer board. If I remember correctly, PC mortality is very high in the first 6 months, drops to near zero for about 5 years, then starts climbing again.
Replace the SSD/hard drive and you are good. You might not even have to do that. I checked the stats on the SSD that came with my used Lenovo workstation and it had only about 20 power-on hours on it.
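If you want to check that yourself, here’s a minimal sketch using smartctl from the smartmontools package. The /dev/sda path is an assumption; adjust it for your drive, and it generally needs root:

```python
# Minimal sketch: print a used drive's power-on hours via smartctl
# (smartmontools). Assumes smartctl is installed and run with root
# privileges; /dev/sda is a placeholder device path.
import subprocess

def power_on_lines(device: str = "/dev/sda") -> list[str]:
    """Return the smartctl output lines that mention power-on hours."""
    out = subprocess.run(
        ["smartctl", "-a", device], capture_output=True, text=True
    ).stdout
    # ATA drives label the attribute "Power_On_Hours"; NVMe drives report
    # "Power On Hours", so match loosely on both words.
    return [
        line for line in out.splitlines()
        if "power" in line.lower() and "hours" in line.lower()
    ]

if __name__ == "__main__":
    for line in power_on_lines():
        print(line)
```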
How small? How many drives? I bought several used Lenovo P330s with Xeon E-2276G CPUs for my servers.
The Intel CPU has a great low-power iGPU for video encoding/decoding, which is ideal for video streaming.
The Xeon’s ECC RAM gives long-term reliability, which is important if you leave your PC on 24/7 for years at a time.
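For anyone curious how that iGPU gets used in practice, here’s a rough sketch of a VAAPI hardware transcode driven from Python. The render node, file names, and bitrate are all assumptions, and it presumes an ffmpeg build with VAAPI support:

```python
# Sketch: H.264 transcode on an Intel iGPU via VAAPI through ffmpeg.
# Assumes ffmpeg with VAAPI support; /dev/dri/renderD128 is the typical
# Intel render node, and the file names/bitrate are placeholders.
import subprocess

cmd = [
    "ffmpeg",
    "-hwaccel", "vaapi",                       # decode on the iGPU
    "-hwaccel_device", "/dev/dri/renderD128",  # typical Intel render node
    "-hwaccel_output_format", "vaapi",         # keep frames on the GPU
    "-i", "input.mkv",
    "-c:v", "h264_vaapi",                      # encode on the iGPU
    "-b:v", "4M",
    "-c:a", "copy",                            # pass audio through untouched
    "output.mkv",
]
subprocess.run(cmd, check=True)
```

A Plex or Jellyfin server does essentially the same thing under the hood when hardware transcoding is enabled.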
Smaller doesn’t need to be more complex. 3.5" drives weren’t more complex than 5.25" drives.
A smaller head means a smaller drive actuator. Less mass and a smaller size mean it can compensate much more quickly when vibration is detected.
Back when full-height 5.25" drives were the norm, you couldn’t pick up your PC while it was running without causing an error. Those tiny CF-card-sized drives still failed, but they took extreme abuse compared to big drives.
Did some Googling and found he was in a sketch in a 1994 episode of Saturday Night Live that was a TNG / Love Boat crossover.
But I can’t find a clip on YouTube!
Ten Forward needs this!