How Apple Just Changed the Entire Industry (M1 Chip)

Views: 4,771,653

Sign up for Morning Brew today for FREE:

ColdFusion Merch:

A quick note: the A14 vs. Intel i9 comparison was normalised for single-core performance.

--- About ColdFusion ---
ColdFusion is an Australian-based online media company independently run by Dagogo Altraide since 2009. Topics cover anything in science, technology, history and business in a calm and relaxed environment. In this video we take a look at Apple’s M1 chip, or Apple Silicon, in the new MacBook.

Interview with ARM founder: • ARM Processor - S...

If you enjoy my content, please consider subscribing!
I'm also on Patreon:
Bitcoin address: 13SjyCXPB9o3iN4LitYQ2wYKeqYTShPub8

--- New Thinking Book written by Dagogo Altraide ---
This book was rated the 9th best technology history book by BookAuthority.
In the book you’ll learn the stories of those who invented the things we use every day and how it all fits together to form our modern world.
Get the book on Amazon:
Get the book on Google Play:

--- ColdFusion Social Media ---
» Twitter | @ColdFusion_TV
» Instagram | coldfusiontv
» Facebook |

First track: Burn Water - Nostalgia Dreams • Burn Water - Nost...

Other Tracks:

Shallou - Love

No Spirit - Careless

Montell Fish - Jam Session

Cody G - With You

Emancipator - Greenland

Oma and Amberflame - Tropical Capricorn

Young American Primitive - Sunrise

WMD - Sentimental

Edward Sharpe and the Magnetic Zeros - Life is Hard (Teen Daze Remix)

Aerocity - And Our Hearts Beat Together

Abhi and Dijon - work.pool.bed

Burn Water - Burning Love

» Music I produce | or
» Collection of music used in videos: • ColdFusion's 2 Ho...

Producer: Dagogo Altraide

Science & Technology

Published on 26 Mar 2023



Comments: 13,087
Paras Arya · 2 years ago
I remember you saying, like 2 years ago, that Apple should use an ARM processor, and now here we are: it happened
The Hobo Goon
This wasn’t him. This was the result of a hacker group having a conversation with Steve Jobs before he passed away. This was us. Stop taking credit for our work and actions. We worked very hard to make our case to Mr. Jobs, and he agreed.
lesley haan · 2 months ago
@Aleksandar Makedonski Wow, you made 95 comments on this video. You must really hate Apple to the core. What did Apple do to you?
Let's not pretend that Apple just doesn't want to leave ANY money on the table either. Soldering everything onto the motherboard is very ...Apple. No upgrade route but to buy a WHOLE new computer from them, at a premium price of course. They are succeeding where Sony has failed.
Fair Beauty · 3 months ago
Correct! I have a feeling Apple can read our brains. I always remember saying inside my head, “I wish my iPhone could unsend and/or edit text messages.” NO LIE!!! Now here we are: we can edit and unsend text messages. 😳 I’ll never forget saying next, “How TF did Apple know this was in my brain as a wish!!??!!” 😡
Max Tech · 2 years ago
Thank you for including some clips from our channel. We are witnessing a computing revolution!
Tempo Passing · years ago
Yes, considering your content these days is 90% worthless clickbait just for affiliate revenue, it's quite surprising.
Matt Lord · years ago
@aG3nt oRanGe LOL... "scalable" apparently isn't in your vocabulary, unless you count AMD's Infinity Fabric, etc. What happens when an equivalently priced server chip arrives at 1/10th the power draw? Even better, a modular version? If that happens, and it may well, you'd be saying bye to x86. It's old and forked everywhere. You may still have your hat, but maybe soon you won't. Remember, some of the first calculator chips had NO competition. Think about it. Just sayin'.
aG3nt oRanGe · years ago
Indeed we are, from Intel and AMD. Not this Apple calculator chip.
Y TB · years ago
@K.O.D. He featured them in person; he didn't steal content from their channel
Akash Garg · years ago
In my computer architecture class, a common question was: which is more recent, CISC or RISC? Almost everyone got it wrong. People naturally assume whatever is more complicated must be more "advanced" or newer, when no, it turns out the innovation isn't making it more complicated, but doing more with fewer instructions. That's what the innovation is. That lesson ought to be applied to lots of things :)
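The distinction described above can be sketched in a few lines. This is a hypothetical toy interpreter, not any real ISA; the instruction names are made up for illustration:

```python
# Toy sketch: the same memory increment expressed as one CISC-style
# instruction vs. a RISC-style load/add/store sequence of simple instructions.

mem = {0x10: 41}   # pretend "memory"
regs = {"r1": 0}   # register file

def run(program):
    for op, *args in program:
        if op == "add_mem_imm":      # CISC-style: read-modify-write memory in ONE instruction
            addr, imm = args
            mem[addr] += imm
        elif op == "load":           # RISC-style: only loads/stores touch memory
            reg, addr = args
            regs[reg] = mem[addr]
        elif op == "add_imm":        # arithmetic works only on registers
            reg, imm = args
            regs[reg] += imm
        elif op == "store":
            reg, addr = args
            mem[addr] = regs[reg]

# One complex instruction...
run([("add_mem_imm", 0x10, 1)])
# ...vs. three simple ones doing the same work. Simple, uniform instructions
# are easier to pipeline and to implement in little silicon: the RISC bet.
run([("load", "r1", 0x10), ("add_imm", "r1", 1), ("store", "r1", 0x10)])
print(mem[0x10])  # 43
```

Both programs increment the same memory cell; the RISC version just spells out the load and store that the CISC instruction hides.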
DurpMustard · 10 days ago
If the head of that nail had a head, you just hit it
Goldenblakcon · 14 days ago
Innovation just means a new concept or finding different ways... it being easier or more complicated has nothing to do with it
Δημήτρης · 18 days ago
CISC made sense back when compiler optimization was primitive and it was assumed that all complex programs and OSes would be written in assembly. Back in those days, an advanced CPU was expected to minimize the "semantic gap" between human programmers and assembly. So the designers of the day weren't dumb or lacking in innovation; they were designing for a different goal.
Roger Knights · a month ago
IOW, “Less Is More”!
Flint Westwood · years ago
Very enlightening. Simply by upgrading my phone a couple of times, I had a subconscious awareness that mobile processors were catching up to desktop performance at a surprising speed. But I had no idea they had come this far.
Derek Henschel · 6 days ago
Apple products are pricey, use the newest tech, break more often and faster, and have amazing usability; overall they cost more, and depending on the product you might not be able to do everything you want. Android phones can be pricey depending on who makes them; they can have the newest unproven tech, they can break easily and often, they can have amazing usability, and the overall cost can be less while getting you everything you want to do. Windows and Linux computers are the same, and can get you everything you want to do, even for less money.

Apple isn't bad, but they make products I can't rely on. They make products that are flashy, with tech the people who buy them usually don't understand or use, and companies that make Android phones, like Samsung, do the same thing. Computers running Linux and Windows generally don't do that, and if you need to, you can easily change parts of them to fit your needs.

I like to say the older companies with the less-known products often do it best. Motorola, say what you will about them, makes better, cheaper, more reliable, more consumer-friendly phones than Samsung or Apple, and they're Android, so that would mean the best phones available are Androids. That's what a lot of people don't understand: it's not that Apple is bad or Android phones are bad; it's who makes it, because Android phones aren't just one thing made by one company like iPhones. Apple doesn't let anyone else use iOS on phones for a reason: no one wants to work with them and they don't want to work with anyone else, and their monopoly on their operating system and the devices that use it is why their products get seen as so bad. There are no options for the consumers their stuff doesn't work for.
James Brendan · 3 months ago
@Cloud11 For gaming? Probably. For work? Maybe, but then we could just argue that a feature phone is far better than a smartphone because we don't "really need that much power" on our phones
James Brendan · 3 months ago
@William Young The same goes for other brands' products, too. I think you forgot that technology evolves with time, lol
Luigi Weegee · 3 months ago
@Aleksandar Makedonski I don't think Apple products are "dumb". Also, aren't Apple products literally outperforming every Android phone out there? And they last over 5 years vs 1 or even 2 years. I'm not defending Apple, just stating the advantages of iPhone over Android.
Aleksandar Makedonski · 4 months ago
@William Young It's Foxconn products with an Apple logo, and dumb customers will buy it 😂
S S · 3 months ago
Very late to the party. Only found this channel a month ago. Such great work. I’ve almost finished all content and look forward to what’s coming next. Amazing channel, describing amazing people and events.
Lee Craig Stewart · years ago
In the last few years, we've seen mobile technology surpass desktop/laptop technology. So putting "mobile" chips into desktop-class devices seemed crazy at first, but with the M1 we've seen a remarkable level of power with no compromise to battery life, something we'd already witnessed for years with the iPhone and iPad's A-series silicon. The M1 has pushed the computer industry forward into realms we could only have dreamt of just a few years ago.
Bro how mad are you LOL
@eurosonly never seen anyone say that. They have industry leading battery life.
eurosonly · 5 months ago
And yet people still say their iPhone has horrible battery life.
Mark Schwartz · years ago
The M1 is amazing, and it's only going to get better from here (and at a faster pace than Intel/AMD). However, I think it's a sickening trend to have the memory and SSD built into the motherboard. Not only can you not upgrade your storage, you can't customize anything about it. And when the RAM or SSD breaks, not only can you not buy a replacement, it can also make the entire system unstable. Apple doesn't want you to be able to fix your own computer if it breaks; they want to sell you a new computer. And considering Apple's influence on other tech companies, that should be reason to worry.
Nick · a month ago
We've already seen it, with the death of the headphone jack (which I admittedly don't miss, but it would be nice for people who do) and the lack of microSD slots on some newer phones
M Isa · 2 years ago
I feel like every processor manufacturer is just killing Intel right now
Erik Kubica · 2 months ago
So much fun reading these comments. Guys, I hate Apple; I've gotten into ugly arguments with my friends about it multiple times. But I recently (3 weeks ago) got a company M1 Air from 2020 for iOS development, because Apple doesn't allow developing iOS apps without Xcode (another reason I hate Apple). I have a Dell XPS 15" (i7, 16 GB of RAM, dedicated GPU) which I bought for €2,300, and the M1 Air cost €1,600. Guess what? Despite my hatred of Apple, the M1 beats that XPS in every aspect except games.

15" XPS OLED 2019:
- Battery life: 3-4 hours of work in power-saving mode
- Heat: hot to the touch; too hot for my lap or any surface that blocks ventilation
- Good performance (when plugged in and not in power-saving mode)

M1 Air 2020:
- Battery life: 10-14 hours of work in whatever mode is the default (if a low-power mode is even possible on an ARM chip)
- Heat: always cool / room temperature; I can put it anywhere, anytime
- The same level of performance for work (except gaming)

So which one do you think is more productive, comfortable, and cheaper? Energy prices are high these days; what is cheaper to use, the M1 that barely uses any energy, or the XPS with an i7 I could cook scrambled eggs on? When you buy a laptop, how often do you upgrade the CPU? Never. The GPU? Never. HDD to SSD? Nowadays almost never, because they are already fast (and Macs have long had faster SSDs than most non-Mac products).

When it comes down to performance, the M1 can do 90% of what I do on a laptop for work (Android and iOS apps, web apps, PHP sites) as fast as my XPS, while using massively less energy, so there's no more low-battery anxiety; it's easier to carry and doesn't burn my hands or legs. It's the clear winner for me for work, while costing less to buy and reducing my electricity bill. And if I want to game, I have a gaming PC, which for the same price as a gaming laptop will outperform any gaming laptop.
Michael Morgan · 4 months ago
@Kurb Najung Lmao, not even close. Apple's CPUs perform faster and use way less power than equivalently sized Intel CPUs. This is a hard fact, brother.
Kurb Najung · 4 months ago
The new ARM processors Apple uses rely on the same old technique Intel used to integrate graphics into the CPU. So Apple is about 12 years late, yet you claim they're killing Intel? Well, AMD is forcing Intel to innovate, but Apple isn't even in the same league as x86.
Michael Morgan · 5 months ago
One year on, and Intel's 13th gen is faster and cheaper than AMD's latest Zen CPUs. It seems the sleeping giant awoke!
TheHvk · 5 months ago
@BlueCarbon Stop, it's already dead!!!
Jaime Rojas · 9 months ago
It is really cool to be able to witness this kind of development in the industry. I'm excited to see what comes next
Michael Erwin · 3 months ago
Though I retired from the semiconductor industry 20 years ago, I still follow the technology and when I first got my hands on an M1 white paper, I must admit I didn't believe Apple could pull it off. Kudos to the ASIC Engineers who did this...absolutely unbelievable!!!
Jakob Riedle · a month ago
Would love to hear more of your thoughts on this!
13thChip · a month ago
What would have been the most challenging hurdle they faced in designing the M1 chip?
hausy · 4 months ago
I will say this for Intel. I still have a 2nd gen i7 in my gaming PC that I bought in 2010 or 2011. It’s starting to struggle these days, but for general use and games, not running at ultra settings obviously, it still works well enough that I’m not bothered at all. I’m honest to god impressed with its longevity. I expected to replace it a long time ago.
Will Stone · years ago
Dagogo, thank you for your content and consistently high production values. I always feel that time spent watching one of your videos is time well spent. Keep up the superior work 👍🏼
S Dias · 7 hours ago
This channel has been consistently at the top of tech news for such a long time! Cheers!
Isssma / イシュマエル
You can say that Apple took the RISC.
Aart Jansen · 3 months ago
Second RISC; the first time didn't end well.
Fred Schnerbert · 3 months ago
@David Wiles Yes, RISC was invented at IBM in the 1970s, about the same time neural networks were first worked on... they just didn't have the computing power... yet...
Fred Schnerbert · 3 months ago
The first prototype computer to use reduced instruction set computer (RISC) architecture was designed by IBM researcher John Cocke and his team in the late 1970s. For his efforts, Cocke received the Turing Award in 1987, the US National Medal of Technology in 1991, and the US National Medal of Science in 1994.
THE GREAT AWAKENING · 11 months ago
@David Wiles Agreed
Jerry Flynn Dale · years ago
VerveTech · years ago
Though the Snapdragon 8cx was one of the first mainstream ARM chips in a laptop, it had issues with some x86-specific apps because Qualcomm's x86 emulation wasn't as capable as the Rosetta 2 translation layer Apple shipped with the M1 SoC a year later.
GoldGun · years ago
They did it again with the M1 Max and Pro chips. Insanely powerful and efficient in power management.
Zagisa · 7 months ago
Not really. Once they reach the same voltage as x86 and try to perform as well as x86 in gaming or other fields where ARM doesn't play well (basically anything besides media), they're actually worse at power management. Minecraft recently dropped full ARM support; the fastest M chip performs about as well as a 10-year-old PC.
Alex · 3 months ago
Compatibility: to be fair, there are _some_ apps that have trouble running on M1, but their number is very limited and it's very specialized software (e.g. some Docker images)
Ceasar Salad · 2 months ago
Not anymore. I have everything I need installed right now
Lois Sharbel · years ago
Watching this 10 months later, and now I'm wondering if other computer manufacturers have jumped on board and are producing computers more affordable than Apple's. Thank you for your amazing work, Dagogo! Love all your videos!
mike mcmike · 2 months ago
@Justin Wyllie Lol, what are you smoking? The Ryzen 9 3900XT has a slightly lower Geekbench score than my 16-inch base-model 10-core M1. Oh, and it's a desktop CPU and draws 105 watts
Justin Wyllie · 9 months ago
The M1 Mac is a good deal only when compared to another Mac, or when looking at power draw. It's equivalent to a 6-core Ryzen 5. Those are like $300
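The metric this thread is implicitly arguing about is performance per watt. A minimal sketch with illustrative, assumed numbers (not measured benchmarks; the scores and wattages here are hypothetical stand-ins):

```python
# Performance-per-watt: a similar benchmark score at far lower package power
# is what makes a laptop chip "win" on efficiency even if raw scores are close.

def perf_per_watt(benchmark_score: float, package_watts: float) -> float:
    return benchmark_score / package_watts

# Hypothetical figures for illustration only:
laptop_chip = perf_per_watt(7500, 15)    # assumed laptop-class score at ~15 W
desktop_chip = perf_per_watt(8000, 105)  # assumed desktop-class score at ~105 W

print(laptop_chip > desktop_chip)  # True
```

The desktop part edges ahead on the raw score, but at 7x the power its efficiency is a fraction of the laptop chip's, which is the comparison both commenters are making from opposite sides.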
Shane P. · years ago
@FullFledged2010 I never said that. In all likelihood, the Apple Silicon chips are more expensive to make than x86 chips because of the ginormous cache they've allocated. The difference is that Apple makes its profit from the final product, not the chips, while Intel, AMD, and Qualcomm need to profit on the chips themselves, so Apple can afford those trade-offs.
FullFledged2010 · years ago
@Shane P. What makes you think the M1 chip is cheaper to produce than x86? It's about the same die size and on 5nm. Sure, the power circuitry could be cheaper, but I highly doubt that would make the actual laptop any cheaper 😑
The Storyteller · 9 months ago
I love the quality of editing and research that you guys do. That's the second time I'm watching this video because this topic is so interesting.
Lemon Squeezer · 2 years ago
If I were Intel, I would be in serious panic mode right now. First AMD, now Apple, blowing Intel out of the water in a short amount of time.
Gadavillers Panoir · 5 months ago
Celeron rulez!!!
Johnson Debrah · 11 months ago
Why is everyone talking as if nobody is going to buy Intel products again? Remember, those chips will not be sold to any other company. The vast majority of people in low-income countries still use Windows. I can't afford an Apple product, so it's immaterial to me; I won't benefit in any way, since they'll be found only in Apple products.
Debt Collector · years ago
meow meow meow
fatalityInOne · years ago
@Tremor244 I would like a citation on that thirty-times-more power consumption; that would melt any CPU in an instant and burn down your house.
Cat · years ago
This is one of the best explainers out there. Now here we are with their new set of processors. The M1 Max has been a game changer for me. I've been able to use CAD in a VM, while using Xcode, all on battery without my legs melting off.
Scott Johnson · years ago
I think this will make VMs take off.
KingKaitain · 11 months ago
It’s kind of cool that a British CPU design (ARM) went on to be the most successful in history.
Gideon Kok · 3 months ago
This seems to be a trend on Intel's side: no vision, future thinking, or forward mindset. I remember watching The Defiant Ones; Intel told record exec Jimmy Iovine something similar. Nevertheless, Jimmy hooked up with Apple, Jobs, and Tim Cook, and now Apple has Beats, which served them very well in setting up Apple Music and the AirPods tech. It's sad, really, when you think about it.
innosanto · 2 months ago
Intel is not doing well recently; AMD has also done better in the last few years
cdsmock · years ago
I debated between an Intel iMac and the new M1 Mac Mini... went with the Mac Mini. Glad I did. It's insanely powerful for the price/size, I can hook it up to any TV or monitor I wish going forward, and it will be fast enough for me for years to come.
John Edge · years ago
Hewlett-Packard had their own RISC architecture called PA-RISC, used on their HP 3000 and HP 9000 (Unix-like) systems. It was quite innovative in the mid-1990s, with some successors in use until the early 2000s, according to Wikipedia.
Joe B · 2 years ago
This feels like the first actual big innovation from Apple in a long time. I want one
Flash · years ago
@Well Prophet!
Flash · years ago
@aG3nt oRanGe His comment aged like fine wine. Yours on the other hand...
MrDjBigZ · years ago
Not really
Naturinda Eli · years ago
@Well I love how well your comment aged 😂❤️
aG3nt oRanGe · 2 years ago
@Omar Oh, it's good. But you won't see it paired with a GPU any time soon, and it won't be able to match that performance alone. FACT. Does that sound less fanboy to you?
Andrew Gerald Nation
I love your channel and greatly appreciate the way each episode is designed to enrich the viewer with knowledge and wonder. ✓
Analog Twleve · years ago
An important bit of history was left out: ARM was founded in November 1990 as Advanced RISC Machines Ltd and structured as a joint venture between Acorn Computers, Apple, and VLSI Technology.
Nufosmatic · 3 months ago
13:30 - PA Semi used a PowerPC instruction set but had a sophisticated power management system (something like 5 different power supplies). The embedded computing market, especially defense, was going over in a big way to switching from PowerPC to PA Semi. And then Apple happened, and several embedded computing companies died as a result.
Apparently PA Semi was also building an evolved form of PowerPC that was more powerful yet consumed less power than the G5. The problem is that it would have taken too long, so Apple went with Intel instead, although said evolved PowerPC CPU was used in the AmigaOne X1000.
Joseph R · a month ago
It's the future; I say this with confidence, especially now that we've got the M2 chip. I can't wait to see the day we get native ARM-based chips from AMD and Intel, perhaps even custom chips direct from Microsoft and other companies that have also licensed from ARM.
Rishiraj Agarwal · 11 months ago
I love your storytelling. I have recommended this video to so many of my friends. How about a follow-up video on Intel's new strategy with its Lakefield chips and the future, in competition with the M1?
mike mcmike · 2 months ago
@Quantum Infinity They just released, you guessed it, an incredibly power-hungry, hot chip! Big surprise! Barely beating the M1 while drawing 130 watts. They just won't learn
Quantum Infinity · 10 months ago
Intel hasn't really launched a direct competitor to the M1 in the lower-power categories yet, so we won't know until their later chips, which they said are supposed to be more of a direct competitor, come out.
David Newbaum · 2 years ago
The quality of this and other similar channels makes you wonder about the future of classic TV programs.
VanSet Productions · 10 months ago
@DaX Nah, TV will be dead sooner rather than later. Kids these days don't want sitcoms or dramas. They barely have the attention span necessary to watch TikToks...
Ryan · 11 months ago
@Tyler Melton I think you just destroyed a small part of that man's soul. Lol
Galaxy 🌌 · years ago
@Bobby Blondino This is not an Apple channel, or a channel tied to any company. He is just talking about how ARM chips are changing the industry. Apple comes into the discussion because they made good-quality ARM chips, better than Intel's x86 architecture. If you had watched his other videos, you would understand it is not a channel about any one company.
kittyhooch1 · years ago
@T. R. Speaking of old people and TV: I turn 65 in a month. I cut the cable nearly 20 years ago and got a Roku. A few years back I realized I was using my tablet and not my 55-inch 4K TV, and got rid of it. Generally, people my age still watch TV, but generalizing in claims is bias. Also, my dad just turned 87. Both of us are planning on being around a while.
Nick Ames · years ago
I hardly watch anything on classic TV anymore. What I do watch, like PBS reports, I watch on my phone
CryptoMood81 · years ago
The Apple A-series and M1 chips are absolute beasts that are revolutionizing the way computing works... The A14 Bionic is clearly faster and, more importantly, much more power efficient than the best Intel has to offer at the moment, and with the A15 that rift is only going to widen... way to go, Apple
MrDjBigZ · years ago
“laissez faire”
The thing I know is that Intel can turn up the heat when they feel it's necessary. But I'm excited for these M1 chips. Great video 👍
Haydo · 11 months ago
@"laissez faire" Yeah, that's fair. I'm drawn to both Apple and AMD for different reasons; each has its own strengths
“laissez faire” · 11 months ago
@Haydo Funny enough, my son got the new i9 and it's faster than my M1 Max. But my M1 Max still feels snappy while I'm editing, so I'm happy.
Haydo · 11 months ago
... fan spools up
“laissez faire”
@Waynegro I agree, but you could've told Apple the same thing a few years back, and now here we are with the M1 and the new ones coming out.
Waynegro · years ago
It’s been necessary for 6 years
Mr Obscure Universe
Before Acorn, the MOS 6502 had a reduced instruction set, pared down from the Motorola 6800. And it's still in production, too.
Christopher Looby · years ago
A few factual nitpicks here. ARM was not the first RISC: Sun Microsystems, Hewlett-Packard, and Digital Equipment Corp all brought out RISC platforms around the same time as Acorn; they all came out of the Stanford MIPS and Berkeley RISC projects. Secondly, the Apple A12 etc. aren't really ARM any more; they are optimised for phone/tablet use, hence why they outperform ARM's own designs, which remain general-purpose processors that can function in IoT, automotive, industrial, and other embedded systems, not just Android. Apple has developed a bespoke ARM optimised for iPhone/iPad and not much else. I'm not sure taking the same route with the M1 for general computing is going to yield the expected result everywhere. I definitely appreciate that it is much less power-hungry, but Intel is still on their 10nm process. When they get to 5nm like the M1 (and their process will always be a node behind TSMC's) then there will be a like-for-like (almost said apples-vs-apples :) ) comparison.
Deckard 5 Pegasus · a month ago
@higgs ocn You're full of it.
higgs ocn · a month ago
@Deckard 5 Pegasus > Actually it does not even matter now. Intel is so far behind everyone, they already lost. Even in the CISC architecture AMD is killing Intel. And in RISC, Intel is in the stone age with absolutely nothing to compete with any ARM, let alone an Apple M1 chip.

Try to educate yourself at least a little before posting nonsense. Learning the differences between CISC, RISC, and ARM, e.g. hardware memory models, would be a good start. So would learning Intel's history before claiming them dead. Read "Only the Paranoid Survive" by Andy Grove.
higgs ocn · a month ago
Unfortunately, this video really misinterprets the history of computer chip design by starting it with Intel and ARM =(
Deckard 5 Pegasus · 2 months ago
There are so many problems with this comment.

"The Apple A12 etc aren't really ARM any more" - There are no true "ARM" chips. ARM only licenses out the IP and doesn't fabricate chips, and all third-party manufacturers make their own versions, with their own additional enhanced instruction-set extensions. Thus the A12 is as much an ARM chip as any other chip claiming to be "ARM".

"ARM which remains a general purpose processor that can function in IoT" - Again, there are NO "general purpose" ARM chips. Every single manufacturer of "ARM" chips is making its own custom version.

"ARM which is optimised for iPhone/iPad" - As written, this is not correct. More correctly, it is optimized for the iOS and OS X software. This is exactly the same as Microsoft Windows and the drivers made for Windows systems because of closed-door agreements, and why much hardware is better supported and works with fewer glitches on Windows than on Linux. This statement applies to all of the IT industry and really says nothing new.

"When they get to 5nm like M1...then there will be a like-for-like comparison" - Actually, it does not even matter now. Intel is so far behind everyone, they already lost. Even in CISC architecture, AMD is killing Intel. And in RISC, Intel is in the stone age, with absolutely nothing to compete with any ARM chip, let alone an Apple M1.
Shawn G · years ago
Very cool video. I've been thinking about a Mac for some time now but held off, mainly due to price and not feeling like they were worth it... Well, then Apple dropped the M1 bomb, and I still held off... Then I used my dinosaur laptop the other day and it literally took 10 minutes just to boot up. Doing anything on it was PAINFULLY slow... I got an M1 MacBook Air last night, and I'm pretty excited to learn this new system and to know that I have one of the best computers on the market, which will probably last me 10-15 years lol... Watching the screen turn on BEFORE I've even fully opened the computer, then doing Touch ID and being at my home screen ready to do whatever WITHIN a second, is pretty f'ing mind-blowing to me!
Roger Knights · a month ago
Once you bite (the apple), you’re bitten (entranced).
Dataflow Geometry · 7 months ago
One of the mistakes Microsoft made with Windows was to overload the start-up sequence with too much unnecessary housekeeping. When Apple adopted BSD UNIX for their OS kernel, they did just the opposite: keep the launch sequence uncluttered and get the desktop up and running in as few seconds as possible.
daniel_960_ · 2 years ago
TSMC also deserves some credit. Without them the ARM and AMD chips couldn’t dominate as much.
Maxielle Villegas · 8 months ago
The engineers behind all the grinding can only watch from the sidelines. Shout out to all my fellow physical and electronic chip designers all over the world. Literally sweat and blood. And TSMC makes all the chips materialize. Hoping the China invasion won't ever happen, but as many say, no one knows the future.
Louis Chege · 8 months ago
@pjoh7 No
Subhajit Chatterjee · 10 months ago
@Aleksandar Makedonski No, it's not like that. They can shift to GF if they want.
Aleksandar Makedonski · 10 months ago
@Subhajit Chatterjee Behind Samsung, Apple can't make any CPU; they buy finished products from TSMC and sell them as "Apple silicon" 😂
ruzzell907 · 4 months ago
13:41 Not sure if this was an error, but wasn't it 2010, with the introduction of the A4 processor, when Apple made their own custom processor? 14:22 And the first 64-bit ARM processors came to Android around early 2015 with the Snapdragon 610 and 710. For Samsung, their first 64-bit phones were the Galaxy Note 4 and Note Edge, using the Exynos 7, in late 2014. Not the Galaxy S4.
Mobile Filmmaking · 11 months ago
I've used nothing but Macs for the majority of my life, and I love my M1 Mac. More so, I love the fact that I didn't have to sell a kidney to be able to afford one.
Test With Chris · years ago
Great production telling the story of these chips. I'm excited about M1!
Masterlehand · 9 months ago
Great analysis! It is crystal clear that RISC is the future. I can't wait to build my first ARM-based desktop PC
Vijay Kanth · 7 months ago
Top-class research. After watching the full video, I truly felt this should be a lesson in an electronics class
Partha Dey · 2 years ago
I can imagine the very tense high level meetings that are happening at Intel right now 😂
Nauris2111 · years ago
Intel's newest desktop chips stomp the M1 into the ground in terms of computational power and efficiency. Their upcoming mobile chips won't be far behind, I can assure you of that.
MrMrbobby · years ago
Especially now. New chips are mad
Ethan Hsu · years ago
@TheLonelyOwl 265 They are still very much at the top and have an enormous amount of capital. If they get an efficient chip out in time, I'd argue they can stay at the top of the market. Apple doesn't care about competing with Intel, since Apple will never build a chip for Windows, so Intel still has a chance.
Nigga Master a year ago
IBM already has a 2nm chip so they don't have to worry lol
Debt Collector a year ago
meow meow meow
gulfstream72 a year ago
An excellent video presented in a captivating format; brilliantly explained. I'm instantly a subscriber and now know why I'm such an Apple nut! Well researched, well explained; everyone with any interest in computers and mobile technology stands to gain from your vids. Thanks!
Benjamin D. a year ago
The interesting thing is almost all Macs and iPhones / iPads sold today have the M1 or Axx chips. In fact, I only own one Mac with an Intel. My iPhone uses the A12 Bionic, my iPad uses the M1, and my 24" 4.5K iMac also uses the M1 chip. Apple-made chips are definitely working hard to force Intel to eat their dust.
Darkside Johnny 7 months ago
One difference in architecture I know of between the early Motorola chips (used by Apple) and Intel's was that the Intel chip had to outsource math functions to an external chip, then wait for the answer. This was time-consuming and generated extra heat. The Motorola (Mac) chip had this function built in. Just this one difference made Mac graphics way better than Intel's. I also heard that IBM originally wanted to use the Motorola chip in their first PC release, but Motorola couldn't manufacture enough chips to satisfy the order, and Intel could. Can you imagine how that would have changed history?
Netkiller3714 5 months ago
ARM chips being able to run off residual power really goes to show how much power the cell transmitter and the screen take up.
Grim Termite 4 months ago
No, the ARM chips in smartphones do not run off "residual power", only that early prototype. Additionally, ColdFusion was incorrect in saying the ARM chip ran off "residual power"; it was running off the power from the instructions being sent to it (which is still incredibly mind-blowing).
Johannes Paulsen a year ago
Very interesting presentation. I bought my first 486 100MHz Intel PC with 2MB of RAM in 1990 or about that time. A few years later I replaced that PC with an Intel Pentium 75MHz based PC with 4MB of RAM. Today, I have a Dell PC with a 3rd generation i5 Intel processor and 16GB of RAM, a dedicated AMD graphics card and a 1TB mechanical hard drive. The performance was not great, very disappointing to say the least. Then a miracle took place. I cloned the Windows 10 installation from the mechanical hard drive to a Micron Crucial 3D NAND SSD. The performance increase was dramatic, proving that the bottleneck was not the aged i5 processor, but the mechanical hard drive.
RWL2012 9 months ago
Yeah, SSDs should've been fitted to PCs a lot earlier than they were, as far back as 2009 or so; shame they were so expensive for so long after that.
sci fifan 2 years ago
I used to write ARM assembly 20 years ago, making my software about 5 times faster than using regular C++ compilers at the time. Seems like this is a skill I might pick up again for even more screaming performance. The ARM instruction set is one of the most beautiful instruction sets I ever encountered on a CPU. So much nicer than Intel 386+, and even more elegant than Motorola 680x0. It's a great future ahead for computing!
Tarun Arya a year ago
@Domenic Keller I think you meant the comment for sci fifan. I think compilers are a good thing, and lots of projects would be too difficult and big to manage in assembly, never mind coping with the vast amount of hardware out there. Even programming in VBA can seem overly tedious. Roll on AI. There is a role for selectively doing some parts of programs in pure assembly, depending on the speed of operation and the bottlenecks faced.
Domenic Keller a year ago
@Tarun Arya This is not the whole truth. C compilers will take the CPU architecture into account, while you have to be extremely knowledgeable and have to spend a shit ton of time to get to that point. Your average programmer has no chance of getting close to a good C compiler.
Major Calibere a year ago
Well of course no widely-produced Instruction Set Architecture (ISA) is as ugly and outright *klugey* as Intel's x86, but... is even the latest ARM ISA as sweet, and so completely *orthogonal,* as that of the ADSP-2100 family? Admittedly a fixed-point DSP, NOT a general purpose CPU, it can be used as a microcontroller in many cases, and almost makes you want to eschew the C compiler and do it all in assembly.
minastaros a year ago
I had done a bit of C64 assembly back in the day, and a bit of AVR 8-bit code during my studies. But when I came to my first ARM7 project, I remember I was stunned by the elegance of the instructions, e.g. the shifting of bits. I felt that this was a completely higher philosophy of thinking, and I liked their approach from the first moment. Now I have just bought my first Apple computer, and as a Linux guy, this was entirely because it's just great technology.
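[Editor's note] The "shifting bits" elegance praised in the comments above refers to ARM's barrel shifter: most ARM data-processing instructions can shift their second operand for free, so a scaled add compiles to a single instruction (something like ADD r0, r1, r2, LSL #2), whereas classic x86 needs a separate shift or an LEA trick. This small Python sketch just mirrors the arithmetic of that one-instruction pattern; the mnemonic in the comment is illustrative, not generated output.

```python
# Sketch of the "shift folded into the add" pattern that ARM's barrel
# shifter performs in one instruction: result = a + (b << shift).
def scaled_add(a: int, b: int, shift: int = 2) -> int:
    """e.g. base address plus an index into 4-byte elements."""
    return a + (b << shift)

# Indexing element 3 of an array of 4-byte ints at base 0x1000:
print(scaled_add(0x1000, 3))  # 0x1000 + 3*4 = 0x100C = 4108
```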
Tarun Arya a year ago
Assembly was always screamingly fast, be it ARM or any other. There has never been anything as good as direct assembly code. It has just become less necessary as chips got faster and there was more memory for everyday applications. It has also been complicated by needing your software to allow for different people's machines with different software and hardware setups, especially where there are multiple manufacturers, i.e. non-Apple. But I agree re native power being unleashed by assembly.
Levi a year ago
This is my second time watching this video. I just remembered the product placement of a Qualcomm Snapdragon processor running the Jaeger in one of my favorite movies, Pacific Rim. In hindsight, that scene would have been cooler if they used the M1 (which, to be fair, wasn’t around then).
CAR TALK UK 9 months ago
Windows and Android user here, and the M1 chip is on a different level. I bought my wife an iPad Air with the M1 chip and was so impressed I bought myself one. It's amazing having a tablet with the performance of a £2k PC for £900, and it's the best for cheap video editing. It was only after watching this video that I got interested, and it genuinely works; I can't believe how fast the M1 chip is.
JonNewsh 10 months ago
It's really unfortunate that it's Apple at the forefront of this revolution. But having said that, I think it's very safe to say that all big companies (tech or otherwise) have evil business practices... so what can we do?
Gumnaam Aadmi a year ago
Even for everyday tasks like surfing the net, preparing presentations or sending work emails to colleagues and clients, the M1 shines. This is a chip for the everyman as much as it is for programmers and data scientists. I am very glad I bought the MacBook Pro in 2021.
Rama Chandran a year ago
It reminds me of how technology, like the evolution of species, follows ever more divergent paths, yet converges on the common desire to create more efficient and/or more capable versions. That's why we need constant competition and a "DARWIN MINDEDNESS" to make the most of our finite resources.
Xen0gears515 a year ago
This is why it's important to have competition. Intel has been at the top of the food chain for many years, but recently AMD has caught up, and you can even argue that they're more performant/efficient. And then there's the ARM chipset, spanking both chips. I'm pretty sure Intel is shitting their pants right now.
Pulkit Gandhi 5 months ago
Uhh, after some time Intel won't have money to buy pants. I mean, someone give Intel the world record for making the hottest chips.
Ben Neeley 5 months ago
AMD has competed with Intel for years, and since the previous gen they have been the top choice for most users. Intel really got complacent over the years, and it shows.
RandomUser 5 months ago
@Evacody124 stuck in thinking of the past
Random One for Stuff 6 months ago
@Evacody124 I am not talking about whether or not Apple is competing with intel or AMD, or in fact, if they are competing with anyone at all. I was just saying that, when I commented, Apple had created the fastest consumer CPU in the world ever made. The fact of it running MacOS is irrelevant. But, for the record, you can run Windows in a Parallels or VMWare Fusion virtual machine and get close to native speeds. I'm really not sure what you're trying to say with "Apple will never catch up to Microsoft or intel or AMD or Nvidia with what they are doing." Catching up in what sense? Because if we're talking about the speed from optimization of software and hardware, Apple destroys all of those companies. Literally a DaVinci Resolve video export is faster on an M1 Ultra Mac Studio than on a tricked-out RTX 3090 + intel core i9 desktop Windows PC.
Evacody124 6 months ago
@Random One for Stuff 🤦‍♂️ Dude, just no. Apple is no competition for Windows, nor for Intel or AMD. Apple is the only company that makes computers that run macOS. Name me another company that builds computers running macOS. I can't go to a Best Buy or Micro Center and buy their M1 chip to put in a computer. You can with Intel and AMD. Apple will never catch up to Microsoft or Intel or AMD or Nvidia with what they are doing. Apple is a niche market in the computer world.
Unusual Attitudes 11 months ago
That was a REALLY well put together video! Great job, & thanks for teaching me a few of the “in the weeds” details.
Kasper Bødker a year ago
I believe that Qualcomm will be a huge player in the laptop space in the future.
Alan Tan 11 months ago
The M1 Mac mini is one of the best Macs I have used in a long time. It just handles everything with ease. The next generation will be even more power efficient and powerful. The industry has changed forever.
Sauvik Roy a month ago
ARM hasn't been much of a favourite for server or enterprise architecture. Maybe with its RISC instruction set it gets fast in some consumer applications, but I am not sure how it might stack up against the classes of instruction sets on x86. Maybe there will be a full overhaul of x86-based systems, but it wouldn't happen in a day. Why doesn't Apple make servers?
Andrew Dwyer a year ago
This has convinced me to look at ARM (vs. x86) for my next laptop.
J H 2 years ago
I've been a PC/Android fan all my life, and I take my hat off to Apple for achieving this.
mike mcmike 2 months ago
@Loading Forever LOL, what?! Hackintoshes were notoriously difficult. And now with the M1 you can get more power for less money buying Apple than building a Hackintosh. You don't know what you are talking about.
mike mcmike 2 months ago
@John Smith You would have been 100% right before the M1. Who thought Apple would be the best bang for your buck and sell an overpowered beast?
Th3RadLad_ 5 months ago
@petruk resing Not really, they're reasonably priced.
MrDjBigZ a year ago
Don't do that
Rajat a year ago
@Rob Ch. 10 months later I see this. While the SQ2 itself wasn't quite up to what I expected, Qualcomm does have a plan to catch up. More competition is only better.
Twix McRaider a year ago
The problem with the ARM system is that you currently don't have the software to make use of this extra power.
Lucas Murad a year ago
Best cost-benefit in computers, at least for me. 100% approved. Nice design, fast, smooth to use, battery lasts more than 15 hours of work. As practical as a phone: you take it from the bag and just start using it from where you were, with absolutely no lag. It's really open-and-use. Nice nice nice.
Craig Best a year ago
I'm a long-time Apple user, and over the last year I've acquired 3 devices with the M1: MacBook Pro, iMac and iPad Pro. Can't wait to see what 2022 brings, with rumours of the M2 in a few months.
Max In Saigon a year ago
12:18 I should also point out that he was the first Intel CEO who wasn't an engineer; he had an MBA. And you know that was the beginning of their downfall.
Windows Xp Wallpaper
As much as I would like to see ARM become more common in the PC world, it also kinda scares me, because ARM devices seem almost impossible to upgrade, with storage and memory built into the motherboard.
Anan Tripathi 2 years ago
This single video is far better than all the other reaction videos combined. It gives more insight and a more in-depth analysis of what actually happened and how it happened.
virt1one 9 months ago
I would have liked a deeper dive into the M1 design such as integrated ram and its other integrated parts, but I suppose that would have gone a bit off-topic. Overall, a very well-done video, thank you!
Bikepacker 10 months ago
You are awesome at what you do. Thank you for these videos.
Omar a year ago
I started using ARM-based Oracle Linux last year and I'm very impressed with the performance! ARM is the future... probably!
Stephanie Romer a year ago
Ya, and I decided my desktop days were over, and moved to California in mid-November 2019, after buying a new laptop with only battery power and processing power as my main criteria… about a month later, I bought a new IPhone. Then by November 2020, I moved to Silicon Valley. 😂🤷‍♀️ Seems like a no-brainer to buy an Apple laptop next… ❤️‍🔥 It has exactly what I wanted. If your laptop runs out of battery, then it’s not a laptop anymore. If it needs tons of fans it’s big noisy and hard to put on your lap. They solved all these problems at once. 🤔🧘‍♀️ This is my second comment. My first was 9 months ago. Safe to say I can’t stop talking about it. 🗣 I finally got to legit use that emoji!
Kevin Borg 7 months ago
Hey Dagogo, I am thoroughly enjoying your channel. I love the new knowledge you are giving me, and in such an enjoyable and captivating way. Thank you, Dagogo. Cheers, Kev
Mark Davenport 2 years ago
“As we set about designing the ARM, we didn’t really expect.. to pull it off.” 🤣 what a legend.
Rarefaction 9 months ago
@RWL2012 👏👏
RWL2012 9 months ago
@Rarefaction *They don't mention Sophie Wilson who was the real brains behind ARM architecture.
Pioneerz 2 years ago
@Gregory Malchuk No fancy pipeline either, but we did think about doing it. We had 24 instructions in total: 6 of them being load and store instructions, 9 data manipulation instructions and 9 jump and branch instructions. I really recommend this YT series; it's very, very long (150 or so videos), but you will come out of it with a complete understanding of processor design. Otherwise, the books we used for our project were: FPGA Prototyping by VHDL Examples by Pong P. Chu (ISBN: 0-13-148521-0); Computer Systems - A Programmer's Perspective by Randal E. Bryant, David R. O'Hallaron (ISBN: 1-292-10176-8); Structured Computer Organization by Andrew S. Tanenbaum (ISBN: 0-13-148521-0); Introduction to Logic Circuit and Logic Design with VHDL by Brock J. LaMeres (ISBN: 978-3-319-34195-8)
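[Editor's note] The minimal ISA described above (load/store, data manipulation, jump/branch classes) can be illustrated with a toy interpreter. All mnemonics, encodings and register counts here are invented for illustration; this is a sketch, not the commenter's actual FPGA design.

```python
# Toy interpreter for a minimal RISC-style ISA with the three instruction
# classes mentioned above: load/store, data manipulation, and branch.
def run(program, memory):
    """Execute a list of (op, *args) tuples on 4 registers; return memory."""
    regs = [0, 0, 0, 0]
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        pc += 1
        if op == "LOAD":          # load/store class: regs[rd] = memory[addr]
            regs[args[0]] = memory[args[1]]
        elif op == "STORE":       # memory[addr] = regs[rs]
            memory[args[1]] = regs[args[0]]
        elif op == "ADD":         # data-manipulation class: rd = rs1 + rs2
            regs[args[0]] = regs[args[1]] + regs[args[2]]
        elif op == "SUB":         # rd = rs1 - rs2
            regs[args[0]] = regs[args[1]] - regs[args[2]]
        elif op == "BNZ":         # branch class: jump to target if reg != 0
            if regs[args[0]] != 0:
                pc = args[1]
    return memory

# Sum memory[0] and memory[1] into memory[2]:
prog = [("LOAD", 0, 0), ("LOAD", 1, 1), ("ADD", 2, 0, 1), ("STORE", 2, 2)]
print(run(prog, [3, 4, 0]))  # → [3, 4, 7]
```

A real hardware version would add fetch/decode stages and (as the thread discusses) pipelining, but the fetch-decode-execute loop is the same idea.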
Gregory Malchuk 2 years ago
@Pioneerz Ahh. What was the total number of instructions? Was it pipelined? Do you have any recommended books on the subject?
Pioneerz 2 years ago
@Gregory Malchuk Indeed, but it's not computer science that I study, it's electronics engineering; it was to go along with the theme of our 4th semester, which was digital design, coming from the 3rd being about analog. So we got an FPGA, got our fingers into Verilog, and programmed our own custom CPU with our own very, very simple instruction set, with instructions such as: add, subtract, fetch data from an address in memory, and a few more. No interrupts though, but they could have been implemented if we had more time. I learned a lot from that project, but that CPU design in essence wasn't much different from CPUs in the 80s. Today's processors have all types of fancy things like branch prediction and parallel processing and FPUs (ours only had an ALU) and more that I haven't looked into that much yet 😊
MrScriptX 8 months ago
Gets me even more excited for RISC-V. Intel has been investing in it. I think we will see great competition and innovation in the coming years. (Also lots of work to adapt everything to the new arch ^^'). I'm already crying in low level.
Yashojit Bhaumik 3 months ago
Can you make a video on what would happen if AMD and Intel also moved to ARM, or whether it is even possible?
Faolan a year ago
I was surprised there was no mention of Apple's Newton, and the importance of ARM in the PDA market of the 90s.
Simon Lasnier a year ago
The way I see it, clearly the biggest innovation was made by the ARM guys, a couple of years too early. Apple was just smart enough to use it :)
JJV a year ago
Not really; there were Windows laptops that used ARM chips from Qualcomm before the M1. Also, the Raspberry Pi runs Linux on a smartphone ARM chip. None of them were really desktop-worthy chips, and they were not great performers. It was the Apple silicon team that was first to make a great, viable chip. They hired all the best engineers and made something that blows away existing laptops. Qualcomm Snapdragon will get there soon; they are always a couple of years behind Apple.
Demby Abella a year ago
I bought my M1 Pro 16-inch 16GB 1TB 3 days ago, and it is INSANE. I'd say 5X improvement over my 2018 15-inch Macbook Pro. Battery life - insane Processing speed - insane
Kyudo Kun 2 years ago
I think it's safe for Apple to include stickers in their MacBooks with "Intel not inside"
MrAdopado a year ago
@klo schuessel ...which is why you don't get them on Apple products even if they do have Intel inside..
young gull kim a year ago
ha ha
Nobody 2 years ago
And I think it's safe for you to copy this comment without the original writer knowing about it
Caio ACO 2 years ago
@Zoura Hey, it definitely can power potatoes very well.
klo schuessel 2 years ago
These stickers... I hate them
PC Gamer a year ago
The real question to ask is how did Intel get so lazy for so long?
Rodolfo Netto a year ago
Kudos for showing Gary Kildall! That guy is the person who showed how an 'operating system' could enable software to run on different machines. OS in quotation marks because Kildall never said his CP/M was an OS. Another kudos for showing the Computerphile interview!!! In 1992, as an undergraduate student, I used SUN workstations with X Windows running on RISC chips.
Gerald Pasion a year ago
Watching this on an iPhone 13 Pro Max. I remember back in the day you were pushing the Note 2 as a PC, and I was pretty convinced it was the most awesome thing! Now I've been editing/rescaling 4K 60fps on this very device with ease. :) The M1 and A15 are crazy!!!
damnation333 17 days ago
Interesting story. Thank you. I would like to hear how AMD fits into this story as well.
Tim Singleton a year ago
Yup, I always wanted, and lusted after, a 486DX100. Wound up skipping it altogether, but sometimes I think I might hunt around and see if I can scrounge one up somewhere. Nostalgia.
Jozsef Sorger 2 years ago
I tip my hat to your professionalism. You provide content which is very interesting, and the way you present it is very academic and professional. There should be more content like yours nowadays. Thank you, Mr Altraide.
Hillary Amerman 2 years ago
You can say that Apple took the RISC.
scabthecat 2 years ago
I agree. This seems like documentary making rather than a YouTube tech review.
misterPAINMAKER a year ago
Maybe Intel should release a new X99 architecture that is also compatible with the previous x86 architecture and ARM, and keep the license open source, so we can have the same apps on mobile and desktop.
Chris Roberts a year ago
Where do you source all your research from? Your level of detail in your videos is awesome!! Thanks, and keep up the great work!!
Rajey Shah a year ago
I kinda had this idea way back in 2016... What if we put Snapdragon 800 series chips, which are already pretty powerful and capable of running mostly all the programs and apps we need, in a laptop frame?
Chris 3 months ago
This video is so great and detailed. Thank you so much! :) Great value!
chris hooge a year ago
RISC dominated data and graphics processing right up until the 2000s. At one point every major gaming console was using IBM's RISC processors. In fact, CISC chips only became ascendant as Microsoft adopted the IBM PC architecture. The creation of Linux really put a damper on the RISC architecture: Intel CISC processors had become a commodity, while Sun Microsystems and IBM had treated their RISC procs as high-end, high-cost offerings. It's nice to see the RISC model having a comeback.
Min 2 years ago
Intel's been lazy for awhile now. Props to Apple! 👏👏
MrDjBigZ a year ago
@Russell Not really; Apple optimizes only for Apple, while AMD and Nvidia have to optimize for every new PC configuration known to humankind.
geroutathat a year ago
Actually Intel has been very active: they stopped Microsoft releasing ARM laptops a decade ago. That's right, a decade ago Microsoft had ARM laptops made by Qualcomm working fully. Intel stopped them.
Murat Demirturk a year ago
They literally did not add any value to their i7 for a decade, and even the i9 is just a modification of the i7. And yes, Intel is surely doing something wrong. Also, if Google makes their quantum computer available for commercial use, I think it will be the end of Intel.
Liviu Ganea a year ago
@Aman Agarwal Wow, tell me you're stupid without telling me you're stupid. SoCs in phones and tablets (laptops don't use SoCs) also have the memory integrated, and that's just one difference.
techvette a month ago
I'm waiting for the Apple/ARM version of the Atrix. Your computer is in your pocket. Plug it up to a display and a keyboard to get some work done. As an engineer on a team with the word "performance" in the title, I wouldn't have thought that was possible a few years ago.
Lionel Rodriguez a year ago
I congratulate you for developing a micro so powerful that I will never buy it, due to its closed license. Cheers.
Danny Scheers a year ago
Does anyone remember the Atari Transputer? I saw it run in 1988 at the CeBIT Messe. Based on what were basically RISC chips, a machine full of farm cards did things back then that rivaled some modern video cards...
Prince Banini 4 months ago
This is just breathtakingly beautiful. I love the breakdown and storytelling.
Mark Mark a year ago
Watching from an M1 Air: best computer experience ever. No noise, insane battery life, insane performance.
CyberOne a year ago
I am taking a moment to appreciate the free knowledge that exists on the Internet
VanSet Productions 10 months ago
You usually pay by being subject to ads, but alas, it's sort of free... for now... Make sure to educate yourself and others about net neutrality! The fight isn't over yet!
CE a year ago
@carso1500 "It isn't really free" followed by "you only need to pay" are basically the same statement. What @DDD really meant to say was, "Nah dude, not free at all!"
Tuck John Porter a year ago
*2* *JNostro* Agreed. Me Too. But, that also means the _"Cat's out of the Bag"_ on something new! During my *New* *Wave* Entry Years on *Melrose* Ave as a Costumer, it was absolutely thrilling and exciting to see my Fashion Goods hung in the Boutique Store Windows there as I drove by. But this also meant my cool fashion idea was set loose and up for grabs to knock-off artists going forward. Thus I'd return home and rapidly start working on designing the next best thing. _".....ooh..wait..wait.. I just fired up an Opera Browser. Apple just announced M2 in a surprised Twit to Mcrosoft! (just kidding, but you know how this shlit goes)."_
11WicToR11 a year ago
Totally not free... sheeple pay by watching ads, while the rest of us are being manipulated by targeted video suggestions. That is something that is impossible to put a label on, but it totally can change where countries are heading.
Patrick Lauge a year ago
Thanks for a really good and exciting video. I have seen it two or three times, and I really think it's crazy what Apple has done and how the technology has evolved.
Indsofin a year ago
The main issue with ARM in the past was that it required way more memory than the CISC CPUs, and memory was really expensive. CISC CPUs required less memory, which made them better (and cheaper). But as technology improved, memory prices went down, which opened the door to using RISC CPUs (such as ARM) more freely, to the point that today, when memory "is not important", it can be done. I never liked Apple in general. Besides the iPod, I've always thought it was an overpriced company without that much innovation, until they came out with the M1. The new M1 may be super powerful, but not that useful yet. And don't get me wrong: I think it won't be that useful because of the other side of the coin, the software. The problem with disruptive technologies is that the market needs time to adapt. And that's why I think the M1 may be an "undermined" CPU. But I expect that by the time of the M2 or M3, the world will have adapted and we'll be dropping the CISC system (x86-64, AMD64). Lastly, I guess the next disruptive change will most likely be quantum computing CPUs.
Dan One 8 months ago
For some reason I just don't care about the M1 and keep buying Intel desktops. The major innovation is that they integrated the RAM into the CPU package.
Osita Anisiobi 10 months ago
Solid statements, but you lost me at quantum CPUs: they require way too much error correction and have to operate in heavily controlled environments. Anything as much as a bump of the table could throw calculations off completely. Quantum computing will become really useful in about 20 years.
TheGothGaming 10 months ago
Well, to be fair, Apple has disrupted markets in the past. They disrupted the music industry with the iPod and iTunes (making buying music discs obsolete), then they disrupted the mobile market with the iPhone, so they do have their credit in innovation. They are about to disrupt the market yet again, this time the CPU market. In 5 years from now we will see most laptops and desktops using ARM.