Recent comments in /f/gadgets

seiggy t1_j46s4xa wrote

Think you're confusing 1-3 MHz with 1 MHz-3 GHz. In the 1990s, CPUs were already running in the hundreds of MHz. The first 1 MHz chips appeared in the 1970s; the first commercial PC, the Altair 8800, used a 2 MHz Intel 8080. The original IBM PC launched in 1981 with a 4.77 MHz CPU. In 1995 the Intel P5 was running at 100 MHz, and in 2000 AMD released the first 1 GHz CPU.

1

aliendepict t1_j46o8u8 wrote

I also do work in this field... "AI" gets used when a computational device is imitating or performing a human-like interaction, or is facilitating something that would typically require human-level intelligence.

This product was likely trained using ML algorithms, which is where they get "AI based": ML is a subset of AI and refers to the training of such systems, so in that sense it isn't incorrect. My gripe is that with how loosely the likes of Microsoft, Google, and co. have defined "AI", we won't be able to hold truly meaningful conversations about what AI actually is. The definition has lost its value when everything from ChatGPT, which exhibits genuinely AI-like qualities, to my Samsung TV turning on because I walked into the room gets called AI. How do we delineate that for the masses?

TLDR: I'm a grumpy codger who is tired of marketing dumbing down definitions simply to sell things that have existed in some form or fashion for a decade, revitalizing them by slapping the term "AI" on them.

20

barzamsr t1_j46mmtp wrote

I agree that the word AI is too "markety", but I think you've fallen into the same trap.

The definitions of AI most commonly held by actual experts in the field come down to something like: the ability to use information to make decisions in pursuit of a goal.

If "pre programmed algorithms" use information about the position of your head to direct hardware to "beamform" (apparently that's an actual thing?) with the goal of improving the quality of the sound you hear, then it is perfectly fair to call that AI.

AI doesn't necessarily or by definition have anything to do with cloud computing, machine learning, or predictive analysis.
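
If anyone's curious what "using information to make decisions in pursuit of a goal" looks like for head-tracked beamforming, here's a toy delay-and-sum sketch in Python. The driver spacing and function name are made up, and the real product's DSP is certainly fancier, but the information-in, decision-out shape is the same:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s at room temperature
DRIVER_SPACING = 0.05    # m; hypothetical spacing between two drivers

def steering_delays(head_angle_deg: float) -> tuple[float, float]:
    """Delay-and-sum beam steering: return per-driver delays (seconds)
    so both wavefronts arrive in phase along the listener's direction."""
    angle = math.radians(head_angle_deg)
    # Path-length difference between the two drivers toward the head
    delta = DRIVER_SPACING * math.sin(angle)
    delay = delta / SPEED_OF_SOUND
    # Delay the nearer driver so the two outputs line up at the listener
    return (delay, 0.0) if delay >= 0 else (0.0, -delay)

# Information in: head 20 degrees off-axis. Decision out: steering delays.
print(steering_delays(20.0))   # roughly (5e-05, 0.0) seconds
```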

63

JKBone85 t1_j46lxao wrote

I’d love a Sears kit home. If Amazon used the Sears sales model, where they act as the finance company, and sold pourable 3D-printed homes at, say, 50k, 80k, and 100k tiers, they’d clean up, and they’d actually provide affordable homes and more opportunity for home ownership.

1

BakesAndPains t1_j46kk8q wrote

Yep! 3D printing goes layer by layer from the bottom up, and is of course subject to gravity. When printing “bridges” over empty space, it’s easy in small plastic prints to use supports because plastic filament is cheap, but those same bridges are much, much harder and more expensive to do in concrete.

This is a real breakthrough for the usefulness of 3D-printed buildings. Houses will be closer to what people expect and want from a home, and able to house twice as many people on the same acreage.

1

olqerergorp_etereum t1_j46j99l wrote

>you just told them their XM4s were bad, which isn't true.

Because they are; it's the truth. They're decent if you need headphones with active noise cancellation, but compared to open-back headphones they sound like ass. That's simply down to how ANC technology works: you can't make an ANC headphone and expect the audio quality of a pair of HD600s, for example.

>You didn't really address why Spotify "is not a great source for music anymore."

Low-quality, low-bitrate music sounds worse, and that's a fact, not something I'm making up. Spotify is not lossless and never has been.

>Most people who support adapters do so because it allows for one less hole on their device and a slimmer profile for aesthetic purposes.

And thanks to that one less port, you now can't use wired headphones while charging your device. Bravo.

>People who know about hardware also like adapters sometimes because you no longer have a single point of failure for headphone connections. It's easier to replace an adapter than replace the headphone port on your phone

I've NEVER, EVER seen a headphone jack fail, but I've had lots of cables fail from the natural bending that comes with normal use. I've never once needed to replace the headphone jack on a phone; even on phones that fell into water, the headphone jack always worked just fine.

>Additionally, is you're really obsessed with "luxury audio," you'd almost certainly need to buy an external DAC for your phone because the built-in one is nowhere near audiophile quality, which would require using a different port anyway.

I already have a DAC, and it can connect either via the standard 3.5 mm jack or via USB-C. I don't really understand your point here: what's the problem with having a headphone jack? Just aesthetics? Just a slimmer phone that you won't even notice in daily use??

I'm not trying to make you mad here or sound condescending; I'm not a native English speaker, so sorry for my tone. And please don't feel offended because I said the XM4s were bad. They're not bad, but they're average at best. I've used them before, and they just aren't high-fidelity headphones; they're bassy headphones for the general market, and that's okay because that's what works for most people, but it's not what people looking for true high-quality audio want.

3

aliendepict t1_j46ix69 wrote

"AI" is becoming a little too markety... Is this thing zipping up a ton of my information and sending it back to a cloud to be computed, then learning how I typically move my head and using predictive analysis to anticipate that?

Or... Is it just using pre-programmed algorithms to match my movements to the speaker position and moving the speaker accordingly?

I'm guessing the latter... So not an intelligence at all...

198

r_golan_trevize t1_j46een2 wrote

I should also point out that the steps from 0 to 1 to 2 to 3 ~~MHz~~ GHz were not linear at all. Going from 0 to 1 ~~MHz~~ GHz took from the dawn of computing to the late 1990s, then we went very quickly from 1 to 3 ~~MHz~~ GHz in the span of just a few years, and then we leveled off around 3.4 ~~MHz~~ GHz very quickly after that. It wasn't really linear at all.

1

[deleted] t1_j46clxd wrote

The circuit is designed with that spec in mind, but the actual breakers you find in use don't trip until 15-20% over their rating. No, they do not trip at 80% unless the breaker is fucked.

- Source: me, an EE. I work with this shit quite literally every day, and trust me, you are not tripping a breaker at 1400 W, and no one designing products assumes that only 1400 W is available. And yes, I do work on things that pull that much power or more continuously, and we use residential breakers for testing. The reason a device with a 1400 W continuous rating might trip a breaker is if it pulls an instantaneous load over ~2-2.5 kW; under normal operation, a 15 A breaker will never even sneeze at a 1400 W continuous load.
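
To put rough numbers on that, here's a back-of-the-envelope sketch in Python. It assumes a standard 120 V North American branch circuit, which is my assumption rather than something stated above:

```python
# Back-of-the-envelope breaker math, assuming a 120 V residential branch circuit.
VOLTAGE = 120          # volts
BREAKER_RATING = 15    # amps

nominal_limit = VOLTAGE * BREAKER_RATING        # 1800 W at the faceplate rating
typical_trip = round(nominal_limit * 1.15)      # ~2070 W once you add the ~15-20% margin
design_rule_80 = round(nominal_limit * 0.80)    # 1440 W, the oft-quoted 80% continuous figure

print(nominal_limit, typical_trip, design_rule_80)   # 1800 2070 1440
# A 1400 W continuous load sits just under the 80% design figure and well
# below where a healthy breaker actually lets go.
```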

2

[deleted] t1_j46c1rx wrote

Uh, no, this is completely wrong, not unless the breakers are really old and need to be replaced. Breakers typically won't trip until about 15-20% ABOVE their rating, so a 15 A breaker wouldn't trip until you start pulling over 2 kW continuous / 2.5 kW instantaneous. Most modern houses use 20 A breakers and 12-gauge Romex now anyway, so there's very, very little chance of someone tripping a breaker at 80% of a 15 A rating.

1