
The Race to Build a Perfect Computer Chip 

Bloomberg Originals · 3.8M subscribers
758K views

Digital activity uses a huge amount of electricity, and semiconductors are near the limit of their efficiency. Now scientists are racing to perfect new chips that use much less power and handle much more data.
#thespark #technology #green
--------
Like this video? Subscribe: brvid.net?sub_...
Become a Quicktake Member for exclusive perks: brvid.netjoin
Subscribe to Quicktake Explained: bit.ly/3iERrup
QuickTake Originals is Bloomberg's official premium video channel. We bring you insights and analysis from business, science, and technology experts who are shaping our future. We’re home to Hello World, Giant Leap, Storylines, and the series powering CityLab, Bloomberg Businessweek, Bloomberg Green, and much more.
Subscribe for business news, but not as you've known it: exclusive interviews, fascinating profiles, data-driven analysis, and the latest in tech innovation from around the world.
Visit our partner channel QuickTake News for breaking global news and insight in an instant.

Published on 9 Nov 2022

Comments: 768
@firstnamelastname7941
0:00 The importance of developing low-energy computer chips
4:12 Carbon Nanotube Transistors
11:07 Photonics Chips
15:26 Neuromorphic Computing
24:47 Conclusion
@benschulz9140 · years ago
and yet Michael P. Frank can't get funding for adiabatic chipsets that have reversible computing....
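For context, the physics behind that idea: irreversible logic must dissipate at least kT·ln 2 for every bit it erases (the Landauer limit), which is exactly the cost reversible/adiabatic designs try to avoid. A minimal Python sketch of that floor - the erased-bit count is an arbitrary illustration, not a figure from the video:

    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def landauer_limit_joules(temp_kelvin):
        # Minimum energy dissipated per irreversibly erased bit: kT * ln(2)
        return K_B * temp_kelvin * math.log(2)

    e_bit = landauer_limit_joules(300.0)  # ~2.9e-21 J at room temperature
    bits = 1e18                           # hypothetical number of erased bits
    print(f"Landauer floor per bit: {e_bit:.2e} J")
    print(f"Erasing {bits:.0e} bits: {e_bit * bits:.2e} J")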
@vikidhillon1884 · years ago
thnx mate
@Kevin_Street · years ago
Thank you!
@luka3174 · years ago
Thank you brother
@shengtaowang7577 · years ago
thx
@adrielamadi8585 · years ago
I'm a computer scientist specializing in software development, but this made me appreciate the hardware side of things. It's really inspiring
@7eVen.si62 · years ago
Pr!ck
@freddoflintstono9321
From my perspective this was one of the most interesting quicktakes I've seen. Well assembled and presented.
@thinktoomuchb4028 · years ago
I'd heard of these technologies to varying degrees, but this piece on the current progress of all of them was informative and fascinating. Thank you!
@rawallon · years ago
Yeah, also, personally I find it quite odd that I never thought about the carbon footprint of our reliance on computers/tech in general
@Ottee2 · years ago
Fascinating, indeed. Not only do we need to consume energy more efficiently, but also, we need to devise novel ways to create more energy on the planet. Maybe, one day, for example, we will have massive solar energy collectors in space, which then transmit that energy to the planetary surface.
@ko7305 · years ago
Epyc.
@notevennelson · years ago
No one asked
@alanhat5252 · years ago
@@Ottee2 ...without chargrilling intervening birds.
@hyperteleXii · years ago
This new chip sounds like a pathfinding co-processor to my game developer ears. Navigating in real-time an order of magnitude more agents in a dynamic world would revolutionize game development. Everybody's stuck on pathfinding. We're still using algorithms from the 1960s.
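The 1960s algorithms in question are Dijkstra's (1959) and A* (1968). A minimal A* on a 4-connected grid, in Python, to make the reference concrete - the grid, start, and goal are made-up values:

    import heapq

    def astar(grid, start, goal):
        # A*: expand the cheapest (cost so far + heuristic) node first.
        rows, cols = len(grid), len(grid[0])
        def h(p):  # Manhattan-distance heuristic (admissible on a grid)
            return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
        frontier = [(h(start), 0, start, [start])]
        seen = set()
        while frontier:
            _, cost, pos, path = heapq.heappop(frontier)
            if pos == goal:
                return path
            if pos in seen:
                continue
            seen.add(pos)
            r, c = pos
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                        and grid[nxt[0]][nxt[1]] == 0 and nxt not in seen):
                    heapq.heappush(frontier,
                                   (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
        return None  # no route exists

    grid = [[0, 0, 0],
            [1, 1, 0],
            [0, 0, 0]]
    print(astar(grid, (0, 0), (2, 0)))  # detours right and around the wall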
@shivangsingh2463 · years ago
Just want to thank the team at Bloomberg Quicktake for making this really high-quality content for us 🙏🏻♥️
@JonahNelson7 · years ago
It’s genuinely great
@ko7305 · years ago
Epyc.
@JakeWitmer · years ago
Yep. Too bad they're associated with the totalitarian name "Bloomberg." I've met Bloomberg employees before who were rightfully ashamed to be associated with the name...
@AndrewMellor-darkphoton
they said something without saying anything
@ROSUJACOB · years ago
You are welcome Mr.Singhania.
@kayakMike1000 · years ago
The greatest enemy of a wonderful technological breakthrough is the advanced technology that works well enough.
@ko7305 · years ago
Epyc.
@Typhonnyx · years ago
yup, consistency is the death of development
@khatharrmalkavian3306
Nonsense. We're using all of our existing technology and pouring hundreds of billions of dollars per year into researching new methods.
@mnomadvfx · years ago
Well yeah, but more than that: the tech that works well enough is the one already receiving most of the investment. If silicon had become untenable 10 years ago, we would have been forced to switch faster to something radically different. The same is true of NAND flash memory - it's truly terrible for power consumption, and the wear rate of each memory cell is anything but great for long-term storage. But because the investment was so deep, even the most promising advanced replacement tech has constantly been left to rot, even as flash becomes an ever greater power drain on our mobile devices.
@mnomadvfx · years ago
@@khatharrmalkavian3306 All the nope. Hundreds of billions of dollars are going into silicon semiconductor logic and all the other standard computer and information tech of the moment. Only a tiny fraction of that amount is going into alternative paths. Quantum dots were predicted to replace CMOS image sensors years ago, but nothing is forthcoming, simply because the industry investment in CMOS sensors is too high and QDs are not regarded as enough of a benefit to pull money away from CMOS sensor improvement research.

You can create a revolutionary memory tech for not a huge amount of money in a lab like Rice University - but making a chip with it that scales to competitive bit density with a modern 3D NAND chip costs a frickin shipping port full of money in staff and time, only compounded by the industry's lesser experience and knowledge of the newer technologies.

There are some things that are being pursued more vigorously, such as metalenses - they can be produced faster and more cheaply than conventional lenses and offer dramatically increased compactness and utility, as a single achromatic metalens element can replace many in an optical system by focusing all wavelengths onto a sensor in one super-thin piece, rather than needing one element per wavelength and others for extra adjustment. So they are basically winners across the board relative to the technology they aim to replace. 20 years from now we will wonder why camera lenses ever used to be so heavy.
@oldgamer856 · years ago
24:36 Where's the mouse? * Points at camera * What a burn
@nayeb2222 · years ago
This just renewed my interest in engineering, truly an inspiring documentary.
@atlantic_love · years ago
I'm sure tomorrow you'd see a kitty rescue and go "this renewed my faith in humanity", right? You renew your interest in something by doing something.
@nayeb2222 · years ago
@@atlantic_love nothing renews my faith in humanity, I know it's doomed at this point
@HueghMungus · years ago
@@nayeb2222 Then be a prime example, and do not have kids, then dissolve your body and donate all your organs to charity. Thanks!
@nayeb2222 · years ago
@@HueghMungus I do have faith in religion, so it won't be an option for me
@joshuathomas512 · years ago
@@nayeb2222 you have faith in fiction
@edeneden97 · years ago
I really like the jumps between the founders and the skeptic guy
@djayjp · years ago
Still a lot better than delivering physical goods (e.g. Blockbuster vs streaming). I'm sure the former uses more than 100x as much energy.
@niveshproag3761 · years ago
Yes, delivering physical goods uses 100x more energy, but the convenience of digital means that we use it 1000x more. Some people just leave Netflix/YouTube running in the background the whole day.
@djayjp · years ago
@@niveshproag3761 I highly doubt that figure.
@niveshproag3761 · years ago
I highly doubt the 100x too. I just mean our consumption outpaces the increases in efficiency. Proven by the fact that our electricity consumption increases every decade.
@djayjp · years ago
@@niveshproag3761 Certainly not proven as you're not isolating variables.
@florisr9 · years ago
Exactly, we should focus on the way digital equipment makes our lives more productive and efficient, rather than how it consumes 'much' energy (it doesn't). The ratio of energy consumption to value added is tremendously small compared to other sectors like transportation.
@omidsaedi5763 · years ago
Such awesome content you produce. It has something to teach nearly anybody, at any level of knowledge, about the problem.
@madad0406 · years ago
Literally just passed my computer architecture final and then this was recommended to me haha. Great video!
@mdaverde · years ago
What a time to be alive. This is obviously exciting work but the skepticism in me assumes that once vendors see that they can do more with less, they'll just do more and so the cycle continues. What is also brutally hard about this work to take full effect is the next layer above this needs to also change to take advantage of these performance/energy gains (the firmware, instruction sets, the algorithms, programming languages and so on). Over the long term though, I'm optimistic of this shift
@michaeledwards2251
Rust and other modern languages will become more significant. Hardware support for typing, inheritance, bounds, soft cells, networks, and flexible control assignment will all be implemented.
@jackdoesengineering2309
I'm using an APU, not a CPU. It's 35 watts and is extremely capable. With computational progress comes reduced power. It's just that they are mutually exclusive, and at some point people have to choose lower power over performance gains. As energy prices rise this choice is reconsidered.
@michaeledwards2251
@@jackdoesengineering2309 Underclocking tricks allow the computational rate to be reduced to the demand rate with lower power consumption. This still allows high computational rates, with high power consumption, whenever they are needed.
@amosbatto3051 · years ago
To some degree the energy consumption hasn't fallen, especially with desktop PCs, but the greater energy efficiency has made possible all sorts of new form factors which are much more energy efficient: laptops, netbooks, tablets, smart phones and smart watches. Eventually we will get to smart glasses, smart earbuds and smart clothes that are extremely energy efficient and can replace much of the functionality of traditional PCs. If you look at energy consumption in advanced economies, it is actually falling, which is an indication that we are doing more with less energy. As a computer programmer, I can tell you that energy efficiency is becoming increasingly important in programming. Not only are programmers focusing more on code that can deal with low energy systems running on a battery, but they are focusing more on compiled languages, such as Rust, Swift, Go and Julia, that use less memory and computing cycles than interpreted languages.
@boptillyouflop · years ago
@@michaeledwards2251 Hardware implementation of typing, inheritance and bounds hasn't yet been able to make any of these things faster for code that uses them:
- Inheritance is basically a fancy jump instruction. The main problem is that with inheritance, your jump address usually has to be loaded from memory, which can take many cycles, and the CPU has to basically guess the branch target and run a whole bunch of speculative instructions while the address loads for real. Having a special version of "jump to variable address" just for inheritance doesn't gain much over the regular variable jump.
- Bounds is likewise a fancy conditional branch. Conditional branches that rarely get taken are already quite cheap on modern CPUs - they take up slots in the instruction decoder and micro-op execution, but they don't compete for the really important slots (memory loading/storing). In fact, loading the bound is definitely slower than testing it (since it uses a memory load instruction). The speed gain from adding hardware bounds tests is likely to be rather small.
- Typing is in a similar situation. Usually dynamically typed variables are either just dynamic versions of small fixed-size static types (double, float, bool, int32_t, int64_t) or larger dynamic-sized variable types (strings, objects, maps, etc). The larger dynamic-sized types have to be handled in software (too complex for hardware), so you'd still have to load the type and test for it. The small fixed-size types could conceivably be handled in hardware, but you'd probably just be using the largest type all the time.
@pterandon · years ago
Superb presentation. Both “pop culture” exposure, and real technical info for experts
@AlanTheBeast100 · years ago
I heat my house with electricity for nearly 6 months per year, so during that time 100% of the electronics use in the house consumes zero net energy. Data centres should be located where there is a need for heating water or other materials; then all of the heat can be dumped into the early part of the manufacturing process.
@paulspvk6049 · years ago
You're correct about the second part. But unless you're using a space heater instead of an HVAC, it's not zero. Heat pumps, which move thermal energy, are actually more efficient than pure electric-to-heat converters.
@AlanTheBeast100 · years ago
@@paulspvk6049 Regardless of how I heat, the heat from all things dumps into the house - so no extra charge ($). As to heat pumps: True enough, but it's cold here (-24°C presently, -30°C tomorrow - will be mild next week) and electricity is very, very cheap - whereas heat pumps are expensive to buy and install - and fail expensively. That said, Hydro Quebec will subsidize about $1000 if I throw a heat pump at it. Maybe some day.
@paulspvk6049 · years ago
@@AlanTheBeast100 Makes sense.
@RB747domme · years ago
Fascinating, as Spock would say. It's an interesting decade or two ahead as we grapple with these different technologies, in the hope that one of them will become commercially mainstream - and breathe new life into the industry for another 50 years or so, until something newer and more radical appears. Ica is also fascinating, especially its neuromorphic learning paradigm, and will definitely accelerate the rate at which robots can learn their surroundings and interact, as well as learn from their past and build on their intelligence. The future is definitely bright.
@johnsavard7583 · years ago
We've already made amazing strides in the power efficiency of computers. An IBM 360/195, with cache and out-of-order execution, like most modern computers, used much more power. And go back to the days when computers used vacuum tubes instead of transistors, and their power consumption compared to the work they could do was much higher.
@ko7305 · years ago
Epyc.
@mnomadvfx · years ago
That is true, but back when that happened the worldwide use of computers was just a tiny fraction of what it is now. The increase in use means we need to push the hardware efficiency ever further to keep up.
@0xD1CE · years ago
Wirth's Law. We don't necessarily need better computers; we need software to be more efficient. Nowadays it's normal for computer programs to occasionally crash due to memory leaks or bugs in the code. I work at a datacenter and I have to use this app on my phone to do daily routine inspections, and the app crashes when open for too long... It's crazy how tolerant we've become of unstable software.
@chi4829 · years ago
15:11 The interconnect cables which are devised to mitigate energy consumption challenges in data centers are just simply optical fiber interconnects which are directly plugged to the ASIC. Co-packaged optics technology bridges the gap between electronics and photonics by integrating them on a common platform with photonic chip serving as the first point of contact to the external world.
@siddhantjain243 · years ago
Lithography "nm" these days doesn't really mean an exact number, i.e. 5nm doesn't actually mean a 5nm manufacturing process
@djayjp · years ago
Yeah TSMC state it's more of a marketing term than anything.
@siddhantjain243 · years ago
@@djayjp same goes for Samsung & Intel
@coboltger1354 · years ago
It's all about transistor density! :)
@DementedPingu · years ago
@@coboltger1354 Isn't it referring to the size of transistor gates?
@mmmmm49513 · years ago
It did at one point. But now it’s just used to say it’s 2x better than this old process etc.
@stevesedio1656 · years ago
Most of the data that moves around a server farm goes over copper, even when computers are paralleled. Light travels through fiber at 65% of the speed of light, through copper at 60%. The devices that convert data to light have the same limits as the devices that drive wire. Light can send more than one signal using color, but that only uses a small slice of the available bandwidth. Copper wire operates at a lower frequency (maybe 10GHz vs 50,000GHz), but uses the entire bandwidth of the wire. The big advantage fiber has is how far a signal can travel.
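Plugging in the comment's own velocity factors, the propagation-speed difference is indeed tiny; the 100 m run length below is just an assumed example:

    C = 299_792_458.0  # speed of light in vacuum, m/s

    def one_way_latency_ns(length_m, velocity_factor):
        # Propagation delay only; ignores serialization and switching.
        return length_m / (C * velocity_factor) * 1e9

    for medium, vf in (("fiber  (~65% c)", 0.65), ("copper (~60% c)", 0.60)):
        print(f"{medium}: {one_way_latency_ns(100.0, vf):6.1f} ns over 100 m")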
@vrajpatel3139 · years ago
the neuro computer looked fire🔥
@CoreyChambersLA · years ago
The atom is not the limit to size reduction. Subatomic particles can perform the same functions, better, cheaper and faster.
@ryansumbele3552 · years ago
Very informative, thank you from Cameroon ❤️
@alecs536 · years ago
I'm glad that Pam from "The Office" has finally found her calling
@roopkaransingh1794
Amazing content, such cool information - please keep coming up with these kinds of videos
@BradenLehman · years ago
24:19 "What is this called" *flips off the teacher* 🤣
@sergioespitia7847 · years ago
Definitely. This is a pretty important documentary to inspire engineers all over the world.
@weirdsciencetv4999
Neuromorphic computers are the next key technology. What would be interesting is if chips can be more three dimensional, as opposed to the relatively two dimensional chips afforded by conventional lithography techniques.
@danradnedge · years ago
Could we solve a lot of the energy problem by writing more efficient code? It seems that as processing power has increased developers are less concerned with memory constraints. There is also a lot of pressure to push new features at the expense of optimised code (and of course more and more abstraction layers in coding). It's like Parkinson's law, but with computer memory.
@zonegaming3498 · years ago
I could see AI generating new ways to solve computational problems that reduce the need to compute them. For example DLSS or AI upscaling.
@Meleeman011 · years ago
Nope, because in order to make money, code needs to be shipped fast. There are better ways. What we can do is encode more in less: instead of binary computers we use ternary, or even quaternary, computation, and that could increase the amount of possible calculations. The reason developers are less concerned with memory constraints is that it's expensive to write efficient code - it takes longer, you need to understand more math and how computers work, and it's also more prone to bugs and errors. What you need is something simple enough to write but that provides enough control for the task at hand, and most people don't even know until the product is shipped; optimizations happen after the product is built. A real solution would be using analog computers - a whole bunch of them - to do specific calculations and then translate the results into binary. This in principle is how and why ASIC mining exists: instead of abusing sand and making it think, we simply let conventional silicon read the electrical charge outputs from its mechanical computer counterparts, which needs less power since the silicon need only read those inputs and maybe do a few calculations here and there.
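On the radix point specifically: a base-b digit carries log2(b) bits, so higher radices shorten the word needed to cover the same range. A small Python sketch (the 32-bit range is just an example):

    import math

    def digits_needed(n_values, base):
        # Smallest number of base-`base` digits that can distinguish n_values states.
        digits, capacity = 0, 1
        while capacity < n_values:
            capacity *= base
            digits += 1
        return digits

    for base, name in ((2, "binary"), (3, "ternary"), (4, "quaternary")):
        print(f"{name:>10}: {math.log2(base):.3f} bits/symbol, "
              f"{digits_needed(2**32, base)} digits to cover a 32-bit range")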
@Meleeman011 · years ago
The quickest thing you could do is learn how to use Linux and a terminal, and you would already be using less power than the majority of people. Use a window manager like i3wm, and use more terminal applications, including on your phone with termux. It's not as convenient, but you can do quite a bit with a terminal - so much that I'm convinced real work is done in a terminal.
@kingchrome5551 · years ago
@@Meleeman011 Using C++ server-side can reduce energy usage
@andyfreeze4072 · years ago
@@Meleeman011 mate, i dont want to learn anymore than necessary when it comes to computers. They are supposedly made to adapt to us not us to a binary code. Yes i have done unix and linux before but i gave up, i dont wish to reinvent the wheel, i will let other nerds like you do that. Yes you can, yes i can do this and that but i chose not to. I can think of better things to do with my life.
@gusauriemo · years ago
Nengo is software that works with Loihi, currently with the intention of allowing software applications for neuromorphic chips. The research they do at Waterloo in general is quite interesting
@aresmars2003 · years ago
At least in Minnesota, during winter heating season, I figure waste heat from my PC all goes into heating my home. Of course if I had better home insulation, I'd probably save more in heating that way!
@sergebillault730 · years ago
The best alignment process I know of is using magnetic fields. Is there a way to make these nano tubes or the environment in which they are stored temporarily magnetic?
@PrinceKumar-hh6yn · 5 months ago
I am heavily impressed and amazed at the same time by the kind of presentation Bloomberg has presented here... PURELY scientific...
@jaredspencer3304 · years ago
It's got to be tough for these new technologies to compete with silicon, which has had 50 years of uninterrupted exponential growth. Even if a new technology could be better than silicon, it might never get there, because it can't be immediately small enough or fast enough or cheap enough to compete with the cutting edge.
@latvialava6644 · years ago
New challenges will create new opportunities. Maybe not with commercial applications, but these technological breakthroughs will initiate their own journey with defence and space applications!!!
@femiairboy94 · years ago
They will eventually get cheap enough, the beauty of the capitalist system. 50 years ago owning a computer was impossible. Today the average American has two computers.
@nicolasdujarrier · years ago
I think a few other options have not been discussed, like spintronics (with MRAM already on the market), and maybe (flexible) organic electronics…
@anushantony · years ago
beautifully put together.
@deiphosant · years ago
People always talk about how more efficiency will lead to less energy consumption. But if I know anything about humans, it is that they will always push to the limits (and power/thermals are the limiting factor right now), so I feel more efficient chips are just going to lead to even more computers and increased performance instead of decreased power draw.
@RobinOnTour · years ago
Energy consumption would increase either way
@trolly4233 · years ago
Can confirm we are running out of chips, the chip-to-air ratio changed from 50-50 to 35-65. Trying times indeed. The bag itself is worth more than the chips inside now.
@NoName-de1fn · years ago
It will be mayhem
@rheung3 · years ago
Thanks so much for such a beautifully illustrated video about the modern economy and its current limits, and about those with the courage and wisdom to go beyond them - and hopefully, at the same time, to start reversing the energy footprint causing the climate change disaster.
@spaceprior · years ago
Hey, Bloomberg, could you put links to the things you discuss in the video description? I'd expect your viewers to be pretty likely to want to look further into things and read stuff.
@AlexTrusk91 · years ago
I'm glad to own some old-school household hardware like my kettle and toaster that don't rely on chips and don't last for like 3 years, but for over 30 years and counting (got the stuff from my parents, and maybe I'll give it to the next generation as some magical relics of another time)
@vigneshs6232 · years ago
Wonderful...Great task....Enormous knowledge...Thankyou all....
@johndawson6057 · years ago
This was the best explanation I have heard for quantum tunneling. Thanks guys.
@ankitroy3319 · years ago
This is really what YouTube should recommend
@mousatat7392 · years ago
Even though there are hundreds of companies racing in this field, all of them are pushing the world forward, even if they don't win the pie at the end.
@ojhuk · years ago
Photonics are the future. I've been blown away with the ways they are devising to build logic gates that function by altering photons on a quantum level. Light based computers have been a mainstay in Science Fiction for a long time now and it's amazing to see actual real-world advances with practical applications being made.
@koiyujo1543 · years ago
Well yeah, but maybe - it depends. We can always make our current electronic ones better besides trying to add more transistors; I mean, yeah, we'll need better materials - something like graphene could make computers hundreds of thousands of times faster.
@ojhuk · years ago
@@koiyujo1543 Yeah I agree, there are still advancements to be made in electronics. I imagine hybrid photonic/electronic systems will become a thing before we get any fully photonic chips; from what I understand, the benefits of photonics to latency and efficiency go far beyond what is possible with electronics.
@katarn848 · years ago
I have my doubts about carbon, for after High-NA EUV lithography has reached its limit with silicon wafers - that's like 2 decades out. I think limits will be found in how far you can go in complexity, layers and materials.
@billfarley9015 · years ago
Offhand I can't think of any examples of light-based computers being a mainstay of science fiction. Can you cite any?
@ojhuk · years ago
@@billfarley9015 I can't offhand either. I thought about Data from Star Trek: TNG, but he's positronic. I also thought about Voyager's computer, but iirc that's organic. There is Orac, the Liberator's supercomputer from Blake's 7 - I always assumed that was photonic, but I may be wrong. I'm sure if I was to look hard enough I'd find something soon enough; sci-fi writers have a far greater imagination and scientific knowledge than myself. :)
@_Pickle_Rick_ · years ago
So maybe the AI bottleneck (i.e. level 5 autonomy, 'general AI', etc) is due to the binary nature of the base-layer architecture - from this it sounds like the analogue stochasticity of neuromorphic architectures may be required for AI to meaningfully progress...
@blueguy5588 · years ago
Some corrections: 1) 5 nm process isn't actually 5 nm, it's a marketing term, so the graphic is inaccurate, 2) modern chips are already layered.
@qwertyali2943 · years ago
this really got my attention becuz of my deep interest in science and tech, thanks bloomberg!!
@PrivateSi · years ago
Nice succinct, informative, up-to-date vid and objective analysis. Photonic computing is definitely the way forward. Neuro-photonic and even bio-photonic computing will combine well in the future when the tech. is worked out. 1000x more computing using 1000x less power within 20 years. Moore's Law will be utterly broken, but in a productive way via a large tech. leap or two, rather than slowing to a standstill as pushed by many youtube vids.
@martinblake2278 · years ago
I was surprised that the Mythic chip was not included in this video to represent analog. The neuro-computing part being developed in Mumbai (in this video) has already been created by those guys years ago, and it currently has computing power equivalent to current digital standards while using only 3 watts of energy.
@I___Am · 7 months ago
Mythic chip?
@MrChronicpayne · years ago
The guy with the beard is a great commentator/middle man for this Quicktake. Hope to see him again.
@studiolezard · years ago
Wouldn't a high-frequency vibration like ultrasound, while in suspension, help to align the nanotubes?
@ChrisBrengel · 6 months ago
First minute does a great job explaining how much electricity computers use.
@FS-ft8ri · years ago
Maybe dielectrophoresis in combination with flow fields in solution is a way of tuning and improving the alignment over pre-treated (for instance by lithography) inhomogeneous-surface-energy Si wafers. It worked out pretty well for aligning GaAs nanowires parallel at contacts in a study we conducted at the university.
@ivanlam1304 · 9 months ago
Am I correct in thinking that the aligned nanotubes would form a large scale matrix of potential MOSFET transistors?
@FS-ft8ri · 9 months ago
@@ivanlam1304 In principle I think you could manage to make it that way; however, I have to admit that I am no expert in transistor technology. My knowledge comes more from surface science/electrochemistry/interface science, especially solid-to-liquid.
@TheRomanTimesNews · years ago
13:00 talk to me boi you got me at photon
@JG_UK · years ago
Amazed how long these alternative silicon methods have been in development. Seems like we’re stuck with wafer silicon for this generation
@RobinOnTour · years ago
Let's hope not
@michaelmccoubrey4211
Photonic computers, neuromorphic computers, and CPUs that use carbon nanotubes are very interesting, but frankly, if we wanted to dramatically reduce computer power consumption we could already do it today. We could:
- use the programming languages C or Rust instead of popular programming languages like Python (which is something like 45 times less efficient)
- use RISC-based CPUs such as ARM chips or RISC-V chips
- underclock CPUs so that they maximise power efficiency rather than trying to maximise performance
- use operating system drivers that aim to use minimal power
If we did these things we could probably use < 1% of the power we currently use. We don't do them largely because it would be slightly more inconvenient and would require social change rather than innovations in technology.
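One rough way to see the interpreted-vs-compiled gap behind the first bullet: time the same arithmetic in a pure-Python loop and in NumPy's compiled C kernels. This assumes numpy is installed; the array size is arbitrary and the measured ratio varies by machine:

    import time
    import numpy as np

    N = 1_000_000

    def timed(fn):
        t0 = time.perf_counter()
        fn()
        return time.perf_counter() - t0

    xs = list(range(N))
    arr = np.arange(N, dtype=np.int64)

    t_py = timed(lambda: sum(x * x for x in xs))   # interpreted inner loop
    t_np = timed(lambda: int((arr * arr).sum()))   # compiled C inner loop

    print(f"pure Python: {t_py:.4f} s")
    print(f"numpy (C):   {t_np:.4f} s  (~{t_py / t_np:.0f}x faster)")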
@User9681e · years ago
There is benefit to higher-level languages too. C is only for low-level / high-performance stuff, and it is absolutely irreplaceable there; for other stuff you need higher-level languages like Rust, Python or Java to actually get projects done in time and not mess with optimization too much. There is always a use for both. About underclocking: the PC has a thing called a governor that decides clock speed per workload, so that's already done when you don't need max performance. Plus we have E-cores.
@rajuaditya1914 · years ago
This is such a normie take that it is hilarious.
@User9681e · years ago
@@rajuaditya1914 we are all interested in learning, and yeah, we normies may not understand key concepts
@ericmoulot9148 · years ago
@@rajuaditya1914 Sounds like a reasonable argument to me. Maybe you have some insights to share that'd change my mind and his?
@bhuvaneshs.k638 · years ago
Bruhhhh ur point number 1 doesn't make any sense. U r talking in the software domain. Plus there's a reason why everyone use python. It's faster to build and test
@D-Z321 · years ago
My dad works as an electrical engineer in the semi-conductor industry. Pretty crazy stuff.
@duckmasterflex · years ago
reducing energy consumption is like adding more lanes to a highway, it won't reduce traffic, it will just add more cars
@BlackBirdNL · years ago
24:33, "Here is the mouse." Proceeds to point at the Jell-O. Jump cut to him pointing at the mouse.
@drivenbyrage5710 · years ago
Imagine how much we could save if millions of people weren't living online every waking minute seeking validation, and simply put their phones down. It would save not only energy, but humanity itself.
@SumitPalTube · years ago
Remember, it is these validation-seeking individuals who are pushing scientists and engineers to innovate and come up with fundamentally new solutions, ensuring that we progress as a species. If we didn't need to upgrade, no one would care to innovate, and we would still be happy with stone-age technology.
@BloodyMobile · years ago
I'd like a study on what % of this power consumption falls on user profiling and the processing needed for it. I wouldn't be surprised if it's around or above half of it...
@JJs_playground · years ago
This was a great little mini-documentary.
@Wulfcry · years ago
Reducing environmental impact by choice is said to be used as a performance measure advancing chip design. We could be worse off with no development releasing them: most designs end up on the shelf without ever being released, not even partly as small functional designs. All the cost goes to the larger processing of data, while the least data processing works just as well to uncover much of a design. However, I applaud how they go about it.
@bobob2989 · years ago
Anyone know the name of the track that starts around 09:30 mark?
@carl8790 · 4 months ago
@3:32 there should have been a huge asterisk on the figure of 5nm (nanometers). The transistors aren't actually 5nm in size; that '5nm' technology is just advertising for the manufacturing company's next-gen transistors. Because transistors can be manufactured and packaged differently, there's no agreed, industry-standardized size. Some fabs (places where semiconductors are built) usually give a density figure of x transistors per mm² of their die, but even that is difficult to verify independently.
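As a worked example of that density arithmetic - both numbers below are illustrative assumptions, not vendor-verified figures:

    density_mtr_per_mm2 = 130.0  # assumed '5nm-class' density, million transistors/mm²
    die_area_mm2 = 100.0         # assumed die area

    transistors = density_mtr_per_mm2 * 1e6 * die_area_mm2
    print(f"~{transistors:.2e} transistors on a {die_area_mm2:.0f} mm² die")  # ~1.3e10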
@grumpeepoo · 6 months ago
There is already a neuromorphic chip company from Australia called BrainChip, with their Akida 2nd-gen chip out. They have partnered with ARM, Intel, Prophesee, and Megachips, to name a few
@shadow-sea · years ago
absolutely fascinating
@celdur4635 · years ago
I think we will always pump in more power even if we have more efficient chips, because there is no limit to what we want to do with them. So cool, more efficient chips - great. BUT we will still increase our energy consumption.
@jamesjanse3731 · years ago
If the production process always creates metallic nanotubes as a by-product, could those be aligned magnetically before removing them from the semiconducting ones?
@gkess7106 · years ago
Not when they are copper.
@VA-ie4qq · years ago
I appreciate these new insights. Thank you Bloomberg. Truly exciting new developments.
@chuckkottke · years ago
Well that was fascinating and encouraging! The future belongs to the efficient. 🔵
@gr82moro · 24 days ago
Keeping processor and memory apart - wouldn't that cause too much latency, even though signals travel at the speed of light?
@shapelessed · years ago
There are datacenters that reuse the heat they generate to provide central heating to the towns around them. That's just one example of how much power is wasted on computing - it's enough to heat the houses around you, and it's actually even profitable to do that.
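Back-of-the-envelope numbers for that kind of reuse - every figure here is an assumption for illustration:

    it_load_mw = 10.0        # assumed IT load of a mid-size datacenter
    reusable_fraction = 0.8  # assumed share of waste heat capturable for district heating
    home_demand_kw = 5.0     # assumed average heating demand per home

    reusable_kw = it_load_mw * 1000 * reusable_fraction
    print(f"{reusable_kw:,.0f} kW reusable -> roughly "
          f"{reusable_kw / home_demand_kw:,.0f} homes heated")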
@Carfeu · years ago
Amazing report
@deltadom33 · years ago
It is interesting whether you could combine some of these methods - as with neuromorphic computing or neural networks, you could use the efficiency of, say, photonics. The problem with all these chips is running things like an operating system on top of them, though photonics wouldn't struggle with quantum-mechanics problems, as photons are smaller than electrons; the problem is that you have to convert electricity to light using OLEDs. Neuromorphic computing could solve the carbon nanotube problem, but you can't get smaller than the carbon atom, which is essentially quite big as it has 14 protons and neutrons. To separate the metallic ones you would just need a magnet. Arranging them on a chip would be the most fascinating thing, as you could use a neural network to align the chips in the best configuration. Light to me is the best option with a neuromorphic setup, but light has different wavelengths on the electromagnetic spectrum, and even then it is still going to generate heat if you have light transistors. As you go down to the width of a photon in electronics or photonics, how can you tell whether a light transistor is on or off? If light has mass, that would make a difference. You could control light if it was moving at a specific frequency, say red-shifted. These chips are a long way from being used to run games on a computer. Quantum computing wasn't even brought up.
@nishantaadi · years ago
Semiconductors are the world's new gold and oil.
@anantsky · years ago
There's nothing new about semi-conductors.
@organicfarm5524 · years ago
Semiconductors came in the 1930s; they are not new
@zzmmz3789 · years ago
But still begging for oil
@robertpearson8546 · years ago
You could use q-nary logic like in flash memories. You could use adiabatic circuits. You could use self-timed circuits. You don't have to stay in the same rut, doing the same thing over and over.
@TotallyRandomHandle
I'm a science nerd and fan. But I hope there are simultaneous efforts to develop safe disposal of the carbon nanotube solution 5:19. The tubes are too small to filter conventionally, and they don't easily degrade. Waste has to be considered, particularly since 33% of it is already known to be unwanted by-products (permanently conducting, metallic nanotubes).
@stevegunderson2392
Putting a computer into a toaster is the dumbest use of computing power one can imagine. Are you so lonely that you need emails from your refrigerator? Security = control.
@maneeshs3876 · years ago
Nice video with interesting insights!
@airsofttrooper08 · years ago
25:20 what is the name of this song?
@zAlaska · years ago
The Cerebras wafer-scale exascale processor has 850,000 cores; each core is itself a supercomputer. All ultra-interconnected without bus-speed limits, it outperforms every supercomputer ever built, all on one chip. They believe they have found the pathway to singularity. I gather the only supercomputer that's faster doesn't exist. 44,000 watts - perhaps it could do two jobs at once, heating water in the room while predicting the future; it's that fast. You know, like when you make a movie and it takes forever for the processor to get done with it - picture simulating a nuclear explosion, or fluid dynamics. Current supercomputers draw the event much slower than it happens in actuality; this chip can do more work faster and predict the event accurately, in great detail, faster than it can occur. Made at TSMC at the 5-nanometer scale. Strangely, Moore's law will continue: IBM has already produced chips at two nanometers, so surely there's lots of room for improvement yet to come for the Cerebras wafer supercomputer.
@dekev7503 · years ago
As a microelectronics engineering grad student, I'm very well aware of the major challenges that power optimization can pose. There have been many attempts to "cheat" the physical boundaries of materials; some have been successful, and some have led to entirely different technologies.
@Typhonnyx · years ago
Really? Like, for example?
@boptillyouflop · years ago
Any technology that can build a >1GHz 32-bit exact adder...
@yuvrajsingh-gm6zk · 4 months ago
keep going!
@mr.g-sez · years ago
What about aligning the nanotubes with gravity? Would that be possible? Centrifugal, or electromagnetic and centrifugal?
@waynelynch1 · years ago
Yea, it's an interesting puzzle
@arnoldsujankatru9667
Amazing documentation
@malectric · years ago
I'm in awe of the technology and ideas which have been developed to enable material manipulation at molecular and atomic scales. Just amazing. My choice application of new AI technology: to recognize and edit ads from BRvid videos.
@wesleyolis · years ago
Interestingly enough, is there no natural vibration that causes them to align - like grains of sand, which naturally change their alignment based on the frequency? Could one not tweak the tubes so that, at a certain frequency, they better align to the layout required?
@DanielRamirez-uw7gu
Very interesting video thank you.
@hgbugalou · years ago
Software development is also important and is rarely considered in these types of scenarios pertaining to compute efficiency and carbon output. Today's developers are writing bloated, inefficient code using high-level languages that just add even more overhead. This comes out as wasted CPU/GPU/DPU cycles and thus wasted energy. To some degree the increase in the power of the hardware has caused this, as before, developers had to be much more diligent about writing lean code.
@zhinkunakur4751 · years ago
cmon, are you really suggesting high-level languages are bad and inefficient? I believe high-level languages are really inevitable
@zhinkunakur4751 · years ago
What we should look at more is high-efficiency conversion from upper-bound languages - like basic English instructions - to machine language using machine learning, using the analog energy-efficiency advantage we have. You cannot stop the inevitable, but we can get more efficient code, and there isn't only one way to do it
@hgbugalou · years ago
@@zhinkunakur4751 I am not suggesting that entirely. High-level languages are awesome, and things like Python have made countless cool and invaluable solutions and gotten a lot of people into coding. Part of the benefit of this more powerful hardware is the amount of abstraction that can be done while still getting the job done nicely for the end user. My point was only to highlight my worry that, as these things advance, the lower-level stuff will start to become lost, and how efficient low-level languages can be will be more and more underappreciated, due to lack of understanding or thinking it's voodoo not worth getting into. There are still a lot of scenarios where efficient code matters, and the closer to hardware you are the better. It is important we do not lose sight of that or let that knowledge become stale.
@zhinkunakur4751 · years ago
@@hgbugalou I see - agreed. I too am a little worried about the increasing unpopularity of LLLs. Or maybe the concentration seems to be going down because more people are getting into coding and the vast majority of them will be using HLLs; it's not that the growth rate of LLLs is going down, maybe it's just that HLLs have a higher growth rate
@mnomadvfx · years ago
Hopefully generative AIs can do something about that. It's something I have often thought about when observing the painfully slow development process of new video codecs, from ISA-portable C code to fast, efficient ISA-specific SIMD assembly.
@mlc4495 · years ago
The human mind is a funny thing. This video brought a long dormant memory back to the surface and now I can't shake it. I recalled reading an article in a games magazine in the mid to late 90s that talked about a coming revolution in gaming: light based computing utilising diamonds! One company had created a games machine supposedly a 1,000 times more powerful than the, at the time, still speculative PlayStation 2 and would be releasing it "soon". I'd love to read that article again. I don't even know the magazine I read it in. Perhaps CVG?
@luka3174 · years ago
That said, technology has experienced major innovations in the last 20 years, but the news always likes to make it seem like we'll have futuristic technology in the coming decade
@femiairboy94 · years ago
It’s amazing that just a hundred years ago we barely had cars on the road. The speed at which technology is developing is something else.
@muthua756 · years ago
beautiful!
@ventusprime · years ago
1:36 dust yes
@thesecondislander · years ago
What is the difference between these Neuromorphic chips and a neural network implemented in hardware?
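The usual distinction is that neuromorphic chips compute with spikes and update state only when events arrive, whereas a hardware neural-net accelerator performs dense multiply-accumulates every clock. A toy leaky integrate-and-fire neuron in Python (all parameters arbitrary) shows the event-driven style:

    def lif_neuron(input_spikes, leak=0.9, threshold=1.0, weight=0.4):
        # Leaky integrate-and-fire: membrane voltage decays, input spikes add
        # charge, and the neuron fires (and resets) only on crossing threshold.
        v, out = 0.0, []
        for s in input_spikes:
            v = v * leak + weight * s
            if v >= threshold:
                out.append(1)
                v = 0.0
            else:
                out.append(0)
        return out

    print(lif_neuron([1, 0, 1, 1, 1, 0, 0, 1]))  # -> [0, 0, 0, 1, 0, 0, 0, 0]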
@arlpoon6423 · years ago
Fascinating
@nickvoutsas5144 · years ago
Light traveling through optic chips is the future. Combined calculations from a traditional binary computer integrated with a quantum computer make sense