
The Race to Build a Perfect Computer Chip

Bloomberg Originals
660,802 views

Digital activity consumes a huge amount of electricity, and semiconductors are near the limit of their efficiency. Now scientists are racing to perfect new chips that use much less power and handle much more data.

#thespark #technology #green
--------
Like this video? Subscribe: brvid.net/u-Bloomberg?sub_...
Become a Quicktake Member for exclusive perks: brvid.net/u-bloombergjoin
Subscribe to Quicktake Explained: bit.ly/3iERrup

QuickTake Originals is Bloomberg's official premium video channel. We bring you insights and analysis from business, science, and technology experts who are shaping our future. We’re home to Hello World, Giant Leap, Storylines, and the series powering CityLab, Bloomberg Businessweek, Bloomberg Green, and much more.

Subscribe for business news, but not as you've known it: exclusive interviews, fascinating profiles, data-driven analysis, and the latest in tech innovation from around the world.

Visit our partner channel QuickTake News for breaking global news and insight in an instant.

Published on 9 Nov 2022

735 Comments
First name Last name · 4 months ago
0:00 The importance of developing low-energy computer chips
4:12 Carbon Nanotube Transistors
11:07 Photonics Chips
15:26 Neuromorphic Computing
24:47 Conclusion
Gig27 · 1 month ago
Thank you
Shengtao Wang · 3 months ago
thx
Luka · 4 months ago
Thank you brother
Kevin Street · 4 months ago
Thank you!
Shivang Singh · 4 months ago
Just want to thank the Bloomberg Quicktake team for making this really high-quality content for us 🙏🏻♥️
Nikhil Thomas · 2 months ago
You are welcome, Mr. Singhania.
Andrew Mellor · 2 months ago
they said something without saying anything
K O · 4 months ago
Epyc.
Jonah Nelson · 4 months ago
It’s genuinely great
ThinkTooMuch (b) · 4 months ago
I'd heard of these technologies to varying degrees, but this piece on the current progress of all of them was informative and fascinating. Thank you!
AtlanticLove · 4 months ago
@Brendon Pitcher Anyone concerned about what we're doing to the Earth, absolutely yes.
Brendon Pitcher · 4 months ago
@Wallon from all of this, that's what you take away?
Ottee2 · 4 months ago
@Alan hat , yes.
Alan hat · 4 months ago
R. Daneel Olivaw had a positronic brain in the 1940s...
Freddo Flintstono · 3 months ago
From my perspective this was one of the most interesting quicktakes I've seen. Well assembled and presented.
AdriEl Amadi · 4 months ago
I'm a computer scientist specializing in software development, but this made me appreciate the hardware side of things. It's really inspiring.
Nayeb 222 · 4 months ago
This just renewed my interest in engineering, truly an inspiring documentary.
Nayeb 222 · 3 months ago
@Mr Fake thanks for the info
Nayeb 222 · 3 months ago
@Joshua Thomas sure, whatever satisfies your comfort zone
Mr Fake · 3 months ago
Look into metamaterials. First company to try and scale advanced nanomaterials.
Joshua Thomas · 3 months ago
@Nayeb 222 you have faith in fiction
Nayeb 222 · 4 months ago
@a a I do have faith in religion, so it won't be an option for me
Milan · 4 months ago
What a time to be alive. This is obviously exciting work, but the skeptic in me assumes that once vendors see they can do more with less, they'll just do more, and so the cycle continues. What also makes this work brutally hard to take full effect is that the layers above it need to change as well to take advantage of these performance/energy gains (the firmware, instruction sets, algorithms, programming languages, and so on). Over the long term, though, I'm optimistic about this shift.
Riplikatln Loki · 1 month ago
Silicon is common
DarkMagic666 · 3 months ago
@Amos Batto I would just slightly expand on your thoughts. In the case of desktops, there have been great increases in efficiency over the last 20 years, through increased density and lowered voltage on CPUs. Hard drives were replaced with SSDs. When it comes to GPUs, however, it's the complete opposite, which is why overall power usage has not declined significantly. It should also be pointed out that in most cases this increase in GPU power draw has been for gaming. So the real question is how much of the lack of power efficiency in a desktop is due to the demand for better gaming, and is this truly the way to power-efficiency nirvana?
Sceplecture · 4 months ago
"once vendors see that they can do more with less, they'll just do more and so the cycle continues": that's called progression.
Mick Moon · 4 months ago
@Michael Edwards Rustaceans always come out of the woodwork in these types of tech videos. Rust is not the modern, better version of C until they sort out and upgrade memory management better than C, but they will never sort it out, just like the GNOME developers. I have yet to see a real, actual upgrade to old pure C.
boptillyouflop · 4 months ago
@Michael Edwards If a piece of data is in a file, it will take a lot longer to read than if it's in RAM, yes. Everybody already knows that, you don't need to tell me. You already can change the microcode for some x86 CPUs (AMD K8, K10). But your custom instructions won't run any faster than regular code because the microcode sequencer can't generate any more operations than the regular non-microcode instruction decoder (it's not designed for speed, it's there for security and 16-bit compatibility). You're gaining nothing. Micro-op quantity is not tricky. Your operations on ALL out-of-order CPUs are: Load, Store, Arithmetic operations, and Jump. Everything else is just a combination of these including fancy stuff like Micro-op fusion and Macro-op fusion.
Clint Tapper · 1 month ago
Fascinating, as Spock would say. It's an interesting decade or two ahead as we grapple with these different technologies, in the hope that one of them will become commercially mainstream and breathe new life into the industry for another 50 years or so, until something newer and more radical appears. Ica is also fascinating, especially its neuromorphic learning paradigm; it will definitely accelerate the rate at which robots can learn their surroundings and interact, as well as learn from the past and build on their intelligence. The future is definitely bright.
hypertele Xii · 4 months ago
This new chip sounds like a pathfinding co-processor to my game developer ears. Navigating in real-time an order of magnitude more agents in a dynamic world would revolutionize game development. Everybody's stuck on pathfinding. We're still using algorithms from the 1960s.
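For context on the 1960s-era algorithms the comment alludes to: the staple of game pathfinding is still A* (published 1968). A minimal sketch in Python on a toy grid, with a Manhattan-distance heuristic; names and the grid are purely illustrative, not from the video:

import heapq

def astar(grid, start, goal):
    # Minimal A* on a 2D grid of 0 (free) / 1 (blocked) cells.
    # 4-way movement, admissible Manhattan heuristic; returns the
    # path as a list of (row, col) tuples, or None if unreachable.
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start)]   # entries are (f = g + h, g, node)
    came_from, g_score = {}, {start: 0}
    while open_heap:
        _, g, node = heapq.heappop(open_heap)
        if node == goal:                 # reconstruct path by walking parents
            path = [node]
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dr, node[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < g_score.get(nxt, float("inf")):
                    g_score[nxt] = ng
                    came_from[nxt] = node
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # routes around the wall via the right column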
Omid Saedi · 4 months ago
Such awesome content you produce. It has something to teach nearly anybody, at any level of knowledge of the problem.
Raj Chakraborty · 2 months ago
Didn't expect an analog solution to be listed here, but that sounds very promising when it comes to consuming less energy, along with photonics, which is already a proven technology. My antenna can't catch the rest of the technologies (in fact, none of them, lol), but we have smart and dedicated fellows working on them. Best of luck for all of us!!!
Ashish Patel · 1 month ago
Computer chips work in the digital domain because it is easier for us to handle digital data in hardware. Analog design is really hard compared to digital design. Being a digital design engineer myself, I know how much easier things get after analog-to-digital conversion. Real-time analog data is easier to understand in binary.
Michael Lenczewski · 4 months ago
The greatest enemy of a wonderful technological breakthrough is the advanced technology that works well enough.
Mountain Nomad VFX · 2 months ago
Well yes, but more than that: the tech that works well enough is the one already receiving most of the investment. If silicon had become untenable 10 years ago, we would have been forced to switch faster to something radically different. The same is true of NAND flash memory: it's truly terrible for power consumption, and the wear rate of each memory cell is anything but great for long-term storage. But because the investment was so deep, even the most promising replacement tech has constantly been left to rot even as flash becomes an ever greater power drain on our mobile devices.
Khatharr Malkavian · 4 months ago
Nonsense. We're using all of our existing technology and pouring hundreds of billions of dollars per year into researching new methods.
Typhon Nyx · 4 months ago
Yup, consistency is the death of development.
K O · 4 months ago
Epyc.
Owen Hand · 4 months ago
Photonics are the future. I've been blown away with the ways they are devising to build logic gates that function by altering photons on a quantum level. Light based computers have been a mainstay in Science Fiction for a long time now and it's amazing to see actual real-world advances with practical applications being made.
Icarus · 2 months ago
Data's neural network was photonic.
Koi Yujo · 4 months ago
@Owen Hand yea
Koi Yujo · 4 months ago
@Owen Hand yup
Imcons Equetau · 4 months ago
@Owen Hand A single company has a technological "moat" for making this UV equipment. Their production costs are increasing as the technological difficulty increases exponentially. Currently, tin droplets are precisely dropped into a focal point and vaporized to an excited plasma in order to emit the characteristic UV light. This tin element has to be changed to another one that emits even shorter wavelength. As wavelength is reduced, the masks and photoresist will have to be made compatible with higher energies into deep UV or even X-ray. Also, as the energy level is increased, this will lead to increasing ionizing radiation damage to equipment and semiconductor. Edge diffraction is an issue. Consistently reproducible X/Y physical positioning of wafer substrates and optics to well within 3 micron *for each chip position* over an entire wafer diameter is *already very difficult.* It will get harder as the demanded precision gets even tighter. This slewing, settling into position, and flash exposure must also occur more often as the number of chips on a wafer increases, which results in slower production.
Imcons Equetau · 4 months ago
@Amren Miller The Tesla Dojo tiles are each using 40 optical fiber interconnections to the adjacent [array] tiles. I presume most supercomputers are now using optical fiber connections between server cabinets, perhaps even between blades.
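The wavelength arms race described in Imcons Equetau's EUV comment above is usually quantified with the Rayleigh criterion, CD = k1 * wavelength / NA: the smallest printable feature shrinks with the wavelength and grows with poorer optics. A back-of-envelope sketch in Python; the k1 and NA values are typical published figures, used here only for illustration:

# Rayleigh criterion for the smallest printable feature (critical dimension):
#   CD = k1 * wavelength / NA
# k1 is a process-dependent factor, commonly around 0.3-0.4.
def critical_dimension(k1, wavelength_nm, na):
    return k1 * wavelength_nm / na

print(critical_dimension(0.4, 193, 1.35))   # ~57 nm: deep-UV immersion (193 nm ArF laser)
print(critical_dimension(0.4, 13.5, 0.33))  # ~16 nm: current EUV (13.5 nm tin-plasma source)
print(critical_dimension(0.4, 13.5, 0.55))  # ~10 nm: high-NA EUV now in development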
John Savard · 4 months ago
We've already made amazing strides in the power efficiency of computers. An IBM 360/195 had cache and out-of-order execution like most modern computers, yet used much more power. And go back to the days when computers used vacuum tubes instead of transistors, and their power consumption relative to the work they could do was much higher still.
Dyson · 2 months ago
Wirth's Law. We don't necessarily need better computers; we need software to be more efficient. Nowadays it's normal for programs to occasionally crash due to memory leaks or bugs in the code. I work at a data center and have to use an app on my phone for daily routine inspections, and the app crashes when open for too long... It's crazy how tolerant we've become of unstable software.
Mountain Nomad VFX · 2 months ago
That is true, but back when that happened the worldwide use of computers was just a tiny fraction of what it is now. The increase in use means we need to push the hardware efficiency ever further to keep up.
K O · 4 months ago
Epyc.
Martin Blake · 1 month ago
I was surprised that the Mythic chip was not included in this video to represent analog. The neuromorphic computing part being developed in Mumbai in this video has already been created by those guys years ago, and it currently has computing power equivalent to current digital standards while using only 3 watts of energy.
pterandon · 3 months ago
Superb presentation. Both "pop culture" exposure and real technical info for experts.
mousatat · 4 months ago
Even though there are hundreds of companies racing in this field, all of them are pushing the world forward, even if they don't win the pie at the end.
Daniel Radnedge · 4 months ago
Could we solve a lot of the energy problem by writing more efficient code? It seems that as processing power has increased developers are less concerned with memory constraints. There is also a lot of pressure to push new features at the expense of optimised code (and of course more and more abstraction layers in coding). It's like Parkinson's law, but with computer memory.
Preston Rasmussen · 1 month ago
I think this comment is built on a false premise: that code has become less efficient over time. Nearly all code runs far more efficiently now than it did 20 years ago. It's just that as processing power increases, programmers are able to create applications that do more. And in the arms race for consumer attention, you then have to do more in order to compete. So the issue is that more functionality is being added, not that the functionality is less efficient.
Andy Freeze · 1 month ago
@Meleeman mate, I don't want to learn any more than necessary when it comes to computers. They are supposedly made to adapt to us, not us to binary code. Yes, I have used Unix and Linux before, but I gave up; I don't wish to reinvent the wheel, I'll let other nerds like you do that. Yes, I can do this and that, but I choose not to. I can think of better things to do with my life.
King Chrome · 3 months ago
@Meleeman Using C++ server-side can reduce energy usage.
Meleeman · 4 months ago
The quickest thing you could do is learn how to use Linux and a terminal, and you would already be using less power than the majority of people. Use a window manager like i3wm, and use more terminal applications, including on your phone with Termux. It's not as convenient, but you can do quite a bit with a terminal; so much that I'm convinced real work is done in a terminal.
Meleeman · 4 months ago
Nope, because in order to make money, code needs to be shipped fast. There are better ways: what we can do is encode more in less, so instead of binary computers we use ternary, or even quaternary, computation, and that could increase the amount of possible calculations. The reason developers are less concerned with memory constraints is that it's expensive to write efficient code: it takes longer, you need to understand more math and more about how computers work, and it's also more prone to bugs and errors. What you need is something simple enough to write that still provides enough control for the task at hand. Most people don't even know until the product is shipped; optimizations happen after the product is built. A real solution would be using a whole bunch of analog computers to do specific calculations and then translating the results into binary. This is in principle how and why ASIC mining exists: instead of abusing sand and making it think, we simply read the electrical charge outputs from several mechanical computers and let conventional silicon process those inputs, which needs less power since it only has to read from its mechanical counterparts and maybe do a few calculations here and there.
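A concrete illustration of the "more efficient code" point debated in this thread: the same task solved with an O(n^2) algorithm and an O(n) one, timed with Python's timeit. Fewer CPU cycles is only a rough proxy for less energy, and the task and numbers here are purely illustrative:

import timeit

data = list(range(2_000))
target_sum = 3_997  # only the last pair (1998, 1999) sums to this

def pair_quadratic():
    # O(n^2): check every pair of elements
    for i in range(len(data)):
        for j in range(i + 1, len(data)):
            if data[i] + data[j] == target_sum:
                return (data[i], data[j])

def pair_linear():
    # O(n): one pass, remembering values already seen
    seen = set()
    for x in data:
        if target_sum - x in seen:
            return (target_sum - x, x)
        seen.add(x)

print(timeit.timeit(pair_quadratic, number=3))  # seconds for the nested-loop version
print(timeit.timeit(pair_linear, number=3))     # orders of magnitude less CPU time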
F S · 2 months ago
Maybe dielectrophoresis, in combination with flow fields in solution, is a way of tuning and improving alignment over pre-treated (for instance by lithography) inhomogeneous-surface-energy Si wafers. It worked out pretty well for aligning GaAs nanowires parallel at contacts in a study we conducted at the university.
G H · 1 month ago
My dad works as an electrical engineer in the semiconductor industry. Pretty crazy stuff.
Eremon1 · 4 months ago
I believe the next step will be implementing a practical transistor layout in 3 dimensions instead of the two we currently use. Similar to 3D NAND but far more complicated. Still doable. Intel has been working on it for some time behind the scenes though nothing commercial has appeared on the market as far as I know. I think they're currently working on a way to stack the N and P transistors instead of placing them side by side. Allowing for an increase of total transistors by up to 50% on the same footprint.
Imcons Equetau · 4 months ago
Stacking is an effective path to higher density IF the generated heat can still be conducted away rapidly enough. For example, memory circuits are typically lower power than processing circuits, so memory can easily be stacked. At the moment, memory chips are diced apart, tested, and bonded into stacked silicon sets that pass bus signals vertically. Four layers are already common, and thicker stacks are coming. Typically, the validated memory stacks, processors, capacitors, and interface chips are bonded to a multilayer metalized silicon substrate that passes power, bus, and clock signals among the chips.
WeirdscienceTV · 4 months ago
Neuromorphic computers are the next key technology. What would be interesting is if chips can be more three dimensional, as opposed to the relatively two dimensional chips afforded by conventional lithography techniques.
trolly · 4 months ago
Can confirm we are running out of chips, the chip-to-air ratio changed from 50-50 to 35-65. Trying times indeed. The bag itself is worth more than the chips inside now.
No Name · 1 month ago
It will be mayhem
gusauriemo · 2 months ago
Nengo is software that currently works with Loihi, with the intention of enabling software applications for neuromorphic chips. The research they do at Waterloo is quite interesting in general.
Bloody Mobile · 3 months ago
I'd like a study on what percentage of this power consumption falls on user profiling and the processing needed for it. I wouldn't be surprised if it's around or above half.
femiairboy94 · 1 month ago
It’s amazing that just a hundred years ago we barely had cars on the road. The speed at which technology is developing is something else.
Deiphos Antonios · 4 months ago
People always talk about how more efficiency will lead to less energy consumption. But if I know anything about humans, it's that they will always push the limits (and power/thermals are the limiting factor right now), so I feel more efficient chips are just going to lead to even more computers and increased performance instead of decreased power draw.
Rob · 4 months ago
Energy consumption would increase either way
Kyle F · 4 months ago
Literally just passed my computer architecture final and then this was recommended to me haha. Great video!
De Kev · 4 months ago
As a microelectronics engineering grad student, I'm very well aware of the major challenges that power optimization can pose. There have been many attempts to "cheat" the physical boundaries of materials; some have been successful, and some have led to entirely different technologies.
boptillyouflop · 4 months ago
Any technology that can build a >1 GHz 32-bit exact adder...
Typhon Nyx · 4 months ago
Really? Like what, for example?
Generic Youtuber · 4 months ago
I wonder if there's a way to make them resonate together, with sound or light or something. If they all vibrate together, they may shuffle into place better.
Roopkaran Singh 179 · 4 months ago
Amazing content and such cool information. Please keep coming up with these kinds of videos.
PrivateSi · 4 months ago
Nice succinct, informative, up-to-date vid and objective analysis. Photonic computing is definitely the way forward. Neuro-photonic and even bio-photonic computing will combine well in the future once the tech is worked out: 1000x more computing using 1000x less power within 20 years. Moore's Law will be utterly broken, but in a productive way, via a large tech leap or two, rather than slowing to a standstill as many YouTube vids suggest.
Nic · 4 months ago
Great video! Would have loved to see DNA computing covered as well
J&B beiser · 4 months ago
Me too... 1 cubic cm of DNA is capable of storing all the information of all the data centers in the world 🌎 (I was told)
Ankit kumar · 4 months ago
No matter how fast and efficient chips become, software makers will eventually slow them down or waste the computing power on nonsense futuristic-looking features to make money and gain market share.
sergio espitia · 4 months ago
Definitely. This is a pretty important documentary to inspire engineers all over the world.
Alex Trusk · 4 months ago
I'm glad to own some old-school household hardware, like my kettle and toaster, that doesn't rely on chips and has lasted not just 3 years but over 30 years and counting (I got the stuff from my parents, and maybe I'll give it to the next generation as magical relics of another time).
Steve Sedio · 4 months ago
Most of the data that moves around a server farm goes over copper, even when computers are paralleled. Light travels through fiber at 65% of the speed of light, through copper at 60%. The devices that convert data to light have the same limits as the devices that drive wire. Light can send more than one signal using color, but that only uses a small slice of the available bandwidth. Copper wire operates at a lower frequency (maybe 10 GHz vs 50,000 GHz), but uses the entire bandwidth of the wire. The big advantage fiber has is how far a signal can travel.
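Taking the comment's own figures (65% of c in fiber, 60% in copper), the propagation-delay gap really is small; a quick sanity check in Python, with arbitrary example distances:

C = 299_792_458  # speed of light in vacuum, m/s

def one_way_delay_ns(distance_m, fraction_of_c):
    # Propagation delay only; ignores serialization and switching.
    return distance_m / (C * fraction_of_c) * 1e9

for medium, frac in (("fiber  (65% c, per the comment)", 0.65),
                     ("copper (60% c, per the comment)", 0.60)):
    # 30 m: roughly one rack row; 2 km: a large data-center campus
    print(medium,
          round(one_way_delay_ns(30, frac), 1), "ns over 30 m;",
          round(one_way_delay_ns(2000, frac) / 1000, 2), "us over 2 km")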
Chi · 4 months ago
15:11 The interconnect cables devised to mitigate energy-consumption challenges in data centers are simply optical fiber interconnects plugged directly into the ASIC. Co-packaged optics technology bridges the gap between electronics and photonics by integrating them on a common platform, with the photonic chip serving as the first point of contact with the external world.
djayjp · 4 months ago
Still a lot better than delivering physical goods (e.g. Blockbuster vs streaming). I'm sure the former uses more than 100x as much energy.
djayjp · 4 months ago
@mynameisjoejeans Right, but mobile devices are also incredibly efficient. The point is that I don't think media consumption time has increased significantly or kept up with the vast decrease in energy consumption afforded by digitization/streaming. Counterpoint: 4K 60fps and scrolling through TikTok streams. Still, I think it's handily in favour of energy reduction on a per capita advanced-economy basis.
mynameisjoejeans · 4 months ago
@djayjp hours of tv watched is an unrepresentative metric, as digitisation has led to many more content forms which are no longer classed as TV. Streaming, social media etc are all new categories that when combined constitute far greater consumption than was possible via analogue technology.
djayjp · 4 months ago
@mynameisjoejeans Right but I bet, *all else being equal*, energy consumption is still down in that regard (media consumption). One can look at the number of hours of TV watched for example and I don't think it has changed much over the past few decades.
mynameisjoejeans · 4 months ago
@djayjp The Jevons Paradox is fairly well documented in a variety of contexts. Energy and resource consumption due to convenience is certainly one of the most prominent examples.
Nivesh Proag · 4 months ago
@Floris Maybe you guys are somehow under the impression that I'm against going digital. Not at all; I am extremely pro-digital, it makes everything more efficient and convenient. I was thinking of the universal truth that extreme convenience always creates infinitely more demand. One fairly random example is the traffic problem: we attempted to solve it by increasing the number of lanes and finally realised that doing so creates more traffic. Our models seem to show that even going autonomous may not solve traffic, simply because so many more people will be travelling. Elon had to come up with the Boring Company to attempt to solve traffic.
V A · 4 months ago
I appreciate these new insights. Thank you Bloomberg. Truly exciting new developments.
Bhuvanesh s.k · 3 months ago
As an Indian, I have to compliment the US for pushing this bleeding-edge R&D. I work in the neuromorphic computing area in India, and I'm sure India will start competing with the USA soon.
Jenny Chuang · 2 months ago
@J X Hahaha You must be joking
Bhuvanesh s.k · 3 months ago
@J X Agreed, China is leading compared to India.
J X · 3 months ago
@Bhuvanesh s.k Compared to India. India has potential but it just isn't as big as many people think
Bhuvanesh s.k · 3 months ago
@J X u mean compared to USA?
J X · 3 months ago
China's already much further ahead in all types of chips, whether carbon, photonic or RISC
Jared Spencer · 4 months ago
It's got to be tough for these new technologies to compete with silicon, which has had 50 years of uninterrupted exponential growth. Even if a new technology could be better than silicon, it might never get there, because it can't immediately be small enough or fast enough or cheap enough to compete with the cutting edge.
femiairboy94 · 1 month ago
They will eventually get cheap enough; that's the beauty of the capitalist system. 50 years ago owning a computer was impossible. Today the average American has two computers.
Latvia Lava · 2 months ago
New challenges will create new opportunities. Maybe not with commercial applications, but these technological breakthroughs will begin their own journey with defence and space applications!
Pickle Rick · 4 months ago
So maybe the AI bottleneck (i.e. level 5 autonomy, "general AI", etc.) is due to the binary nature of the base-layer architecture; from this, it sounds like the analog stochasticity of neuromorphic architectures may be required for AI to meaningfully progress...
Jason G · 4 months ago
Amazed at how long these silicon alternatives have been in development. Seems like we're stuck with silicon wafers for this generation.
Rob · 4 months ago
Let's hope not
Michael O · 3 months ago
Software development is also important, and it's rarely considered in these types of scenarios pertaining to compute efficiency and carbon output. Today's developers are writing bloated, inefficient code in high-level languages that just add even more overhead. This comes out as wasted CPU/GPU/DPU cycles and thus wasted energy. To some degree, the increase in hardware power has caused this, since developers previously had to be much more diligent about writing lean code.
Jack Evans · 1 month ago
Arduino is an example of this. The programmer, if you can call him that, has all the clever libraries written by someone else, and all he does is connect them together. He then believes he's written some code, with no understanding of anything at the hardware level.
Omar Ahmed · 2 months ago
@Michael O That should be a completely separate job from the developer's. The developer only needs to develop; the conversion itself is the optimization factor, and that can be done separately by those who create the languages.
N Tal · 2 months ago
@Zhinku Nakur The OP is correct. I write embedded code in C using 128 kilobytes of flash (ROM) and 8 kilobytes of RAM (actually 6K), not megabytes, not gigabytes, yet it's considered leading edge in the industry it's put into. PC/Android/Linux consume so much room, unlike my bare-metal programming. My power use, without even trying to green it: 2.4 watts max while driving motors, dropping to under 1 watt the rest of the time (99.9% of the time; the motors only run for 8 seconds).
learner coder · 2 months ago
Interesting point. I would say compilers are fairly bloated (unnecessary libraries added), but for programmers it depends: some write their own code for each thing, while others use libraries (which does add unnecessary overhead), and since companies like their programmers to be as efficient as possible, library-based programming will become more prominent. Personally, I think you could run a lot of really resource-intensive programs if we multi-threaded our programs or wrote them in lower-level languages (e.g. assembly), but that would worsen the shortage of programmers, as assembly has a very steep learning curve. So maybe?
Emi · 2 months ago
@imeaktsu7 That's the whole point: it should be taken into account. There is nearly a two-orders-of-magnitude power-efficiency gap between C (or Rust) and Python. We should also aim for more frugal software and websites, with less useless stuff.
May the Science be with You
There is an overabundance of silicon in this world. These carbon nanotubes would have to be significantly better to replace silicon. I feel like this is one of those things some people just like to push because it's what they are working on, so of course they promise a lot.
Eric Potter · 4 months ago
AI is already improving itself by taking fewer steps to calculate numbers, thus doing things 10% faster, and of course the faster your chip gets, the more efficient it will be. More than half the power required to run these huge supercomputers and data centers goes to the air conditioning that keeps all the chips cool, so more efficient chips will also cut down on AC requirements. I believe that as AI gets more advanced, it could over time make chips that beat out quantum computers. I should state that my Mac's M1 chip uses no fans; it relies totally on passive cooling. One day high-end server chips will be so efficient, like the M1, that they won't need any fans or cooling at all.
femiairboy94 · 1 month ago
"It could make chips that beat quantum computing": based on what I know about quantum computing, I don't even think that's possible. Quantum chips calculate the probability of an object's state before it's even measured.
Serge Billault · 4 months ago
The best alignment process I know of uses magnetic fields. Is there a way to make these nanotubes, or the environment in which they are stored, temporarily magnetic?
JoelW · 4 months ago
Interesting idea, but I don't think pure carbon has any ferromagnetic properties.
The End of Infinity · 4 months ago
Imagine how much we could save if millions of people weren't living online every waking minute seeking validation, and simply put their phones down. It would save not only energy, but humanity itself.
Sumit Pal · 3 months ago
Remember, it is these validation-seeking individuals who are pushing scientists and engineers to innovate and come up with fundamentally new solutions, ensuring that we progress as a species. If we didn't need to upgrade, no one would care to innovate, and we would still be happy with stone-age technology.
Rogério Batista · 4 months ago
Science is just wonderful! My thanks to all the researchers involved in this process.
Taylor Falk · 2 months ago
I’d imagine computers of the future having computer cubes instead of computer chips. Just layers on layers on layers
F P · 4 months ago
I'm a science nerd and fan. But I hope there are simultaneous efforts to develop safe disposal of the carbon nanotube solution 5:19. The tubes are too small to filter conventionally, and they don't easily degrade. Waste has to be considered, particularly since 33% of it is already known to be unwanted byproducts (the always-conducting metallic nanotubes).
WulfCry · 4 months ago
Reducing environmental impact by choice is said to be used as a performance measure in advancing chip design. We could be worse off with no development releasing them: most designs end up on the shelf without ever being released, not even as partly functional small designs. All the cost goes to the larger processing of data, while the least data processing works just as well to uncover much of a design. However, I applaud how they go about it.
Celdur · 4 months ago
I think we will always pump in more power even if we have more efficient chips, because there is no limit to what we want to do with them. So cool, more efficient chips, great; BUT we will still increase our energy consumption.
aresmars2003 · 4 months ago
At least in Minnesota, during the winter heating season, I figure the waste heat from my PC all goes into heating my home. Of course, if I had better home insulation, I'd probably save more on heating that way!
Phillip Simmer · 4 months ago
Very high quality content. Thank you!
James Janse · 4 months ago
If the production process always creates metallic nanotubes as a by-product, could those be aligned magnetically before removing them from the semiconducting ones?
G Kess · 4 months ago
Not when they are copper.
John Dawson · 4 months ago
This was the best explanation I have heard of quantum tunneling. Thanks, guys.
malectric · 4 months ago
I'm in awe of the technology and ideas which have been developed to enable material manipulation at molecular and atomic scales. Just amazing. My choice application of new AI technology: to recognize and edit ads from BRvid videos.
AlanTheBeast100 · 3 months ago
I heat my house with electricity for nearly 6 months per year, so during that time 100% of the electronics use in the house consumes zero net energy. Data centres should be located where there is a need for heating water or other materials, so all of the heat can be dumped into the early part of the manufacturing process.
Paul Spvk · 1 month ago
@AlanTheBeast100 Makes sense.
Paul Spvk · 1 month ago
You're correct about the second part. But unless you're using a space heater instead of an HVAC, it's not zero: heat pumps, which move thermal energy, are actually more efficient than pure electric-to-heat converters.
Michael McCoubrey · 4 months ago
Photonic computers, neuromorphic computers, and CPUs that use carbon nanotubes are very interesting, but frankly, if we wanted to dramatically reduce computer power consumption we could already do it today. We could:
- use the programming languages C or Rust instead of popular languages like Python (which is something like 45 times less efficient)
- use RISC-based CPUs such as ARM or RISC-V chips
- underclock CPUs so that they maximize power efficiency rather than trying to maximize performance
- use operating system drivers that aim to use minimal power
If we did these things we could probably use <1% of the power we currently use. We don't do them largely because it would be slightly more inconvenient and would require social change rather than innovations in technology.
Bhuvanesh s.k · 3 months ago
@Eric MOULOT First of all, his point number 1 about Python and C is ridiculous. And secondly, RISC-V or any RISC architecture can't replace CISC; most data-center CPUs run on CISC for a reason.
Bhuvanesh s.k · 3 months ago
Bruhhhh, your point number 1 doesn't make any sense. You're talking in the software domain. Plus, there's a reason why everyone uses Python: it's faster to build and test.
Eric MOULOT · 4 months ago
@Raju Aditya Sounds like a reasonable argument to me. Maybe you have some insights to share that'd change my mind and his?
User · 4 months ago
@Raju Aditya We are all interested in learning, and yeah, we normies may not understand key concepts.
Raju Aditya · 4 months ago
This is such a normie take that it is hilarious.
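One item in Michael McCoubrey's list above, underclocking for efficiency, is something Linux already exposes through the cpufreq sysfs interface. A minimal sketch, assuming a Linux machine with cpufreq support (and root access for the write); governor behavior varies by driver:

from pathlib import Path

CPU_ROOT = Path("/sys/devices/system/cpu")

def show_policies():
    # Read the current governor and frequency of each core via cpufreq sysfs.
    for pol in sorted(CPU_ROOT.glob("cpu[0-9]*/cpufreq")):
        gov = (pol / "scaling_governor").read_text().strip()
        mhz = int((pol / "scaling_cur_freq").read_text()) // 1000  # kHz -> MHz
        print(pol.parent.name, gov, f"{mhz} MHz")

def set_governor(gov="powersave"):
    # Needs root. With the generic cpufreq driver, "powersave" pins cores to
    # their minimum frequency, trading throughput for energy; the intel_pstate
    # driver interprets the same name as a dynamic low-power policy.
    for path in CPU_ROOT.glob("cpu[0-9]*/cpufreq/scaling_governor"):
        path.write_text(gov)

show_policies()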
Bemana Prayogoベムプラ
Remember, there's no perfect chip. This isn't a race to create the One Chip; this is a game of balance and adjustments.
Nicolas Dujarrier · 4 months ago
I think a few other options have not been discussed, like spintronics (with MRAM already on the market) and maybe (flexible) organic electronics…
Shapeless · 2 months ago
There are data centers that reuse the heat they generate to provide central heating to the towns around them. That's just one example of how much power is wasted on computing: it's enough to heat the houses around you, and it's actually even profitable to do so.
John Cipolletti · 4 months ago
As a computer programmer and designer, I want to say something here. If you can make a perfect chip, what about all the other factors? The biggest problem would be the faulty software that will run on the system. Also, what about heat build-up in the system? That could change the results. Finally, a change in the electrical current could cause some big problems.
John Cipolletti · 4 months ago
@User That is the problem. Can you say untested loops? Make an error like that and things fall apart. As long as humans have a hand in the process, there will be mistakes! Can you say self-driving Tesla auto?
User · 4 months ago
You will have to emulate/translate older code and hope it works.
Don Lee · 2 months ago
I mean... perhaps we can talk about Apple, who stopped using Intel chips because they were not only slow and a development dead end, but also very power hungry, which created a lot of heat, which required more fans for cooling, which required more power. Their chips now use a LOT less power, are more powerful than before, and require less power to use or charge.
Thomas Collingwood · 4 months ago
The Cerebras wafer-scale processor has 850,000 cores; each core is itself a supercomputer. All ultra-interconnected without bus-speed limits, it outperforms every supercomputer ever built, all on one chip. They believe they have found the pathway to singularity. I gather the only supercomputer that's faster doesn't exist. At 44,000 watts, perhaps it could do two jobs at once: heating the water in the room while predicting the future, it's that fast. You know, like when you make a movie and it takes forever for the processor to finish it, or pictures simulating a nuclear explosion, or fluid dynamics: current supercomputers draw the event much more slowly than it happens in actuality. This chip can do more work faster and predict the event accurately, in great detail, faster than it can occur. Made at TSMC at the 5-nanometer scale. Strangely, Moore's law will continue: IBM has already produced chips at two nanometers, so surely there's lots of room for improvement yet to come for the Cerebras wafer supercomputer.
Robert Pearson · 2 months ago
You could use q-nary logic, as in flash memories. You could use adiabatic circuits. You could use self-timed circuits. You don't have to stay in the same rut, doing the same thing over and over.
mako yass · 4 months ago
Hey Bloomberg, could you put links to the things you discuss in the video description? I'd expect your viewers to be pretty likely to want to look into things further and read more.
Keiko Mushi · 2 months ago
Also, moving away from rare-earth metals and minerals to more common ones would be a game-changer if somebody can make said technology powerful and energy efficient. It would allow third-world countries more ability to compete and thrive as a result. We just need to deal with the antitrust and insider-trading issues that often hamper people's ability to compete with larger corporations that could potentially buy them out.
Nevin Plove · 4 months ago
Hopefully these innovations make quieter notebooks & fanless laptops more widespread.
Ronald Hn · 4 months ago
Thanks so much for this beautifully illustrated video about the modern economy, its current limits, and those with the courage and wisdom to go beyond them, while hopefully also starting to reverse the energy footprint causing the climate-change disaster.
Jameel · 4 months ago
This was a great little mini-documentary.
mlc449 · 4 months ago
The human mind is a funny thing. This video brought a long-dormant memory back to the surface and now I can't shake it. I recall reading an article in a games magazine in the mid-to-late 90s that talked about a coming revolution in gaming: light-based computing utilising diamonds! One company had created a games machine supposedly 1,000 times more powerful than the, at the time, still speculative PlayStation 2, and would be releasing it "soon". I'd love to read that article again. I don't even know which magazine I read it in. Perhaps CVG?
Luka · 4 months ago
That said, technology has seen major innovations in the last 20 years, but the news always likes to make it seem like we'll have futuristic technology within the coming decade.
Qwerty Ali · 4 months ago
This really got my attention because of my deep interest in science and tech. Thanks, Bloomberg!!
IceArdor · 4 months ago
15:04 "In theory you could have a processor in one room, memory in another, storage in another." Maybe, but interconnect latency is still a concern. Light travels 30 cm (1 foot) in 1 nanosecond, which is the duration of one clock cycle when the computer runs at 1 GHz. The latency of communicating with RAM in an adjacent room will limit usability for many computing applications.
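Extending IceArdor's arithmetic, a quick sketch of how many clock cycles round-trip propagation alone would cost if memory were moved meters away; the distances and clock rates are illustrative only:

C = 299_792_458  # speed of light in vacuum, m/s

def round_trip_cycles(distance_m, clock_ghz, fraction_of_c=1.0):
    # Cycles that elapse while a signal travels to memory and back,
    # counting propagation delay only (no switching or serialization).
    rtt_s = 2 * distance_m / (C * fraction_of_c)
    return rtt_s * clock_ghz * 1e9

print(round_trip_cycles(0.15, 1.0))      # ~1 cycle: RAM 15 cm away at 1 GHz
print(round_trip_cycles(10, 1.0))        # ~67 cycles: memory in the next room
print(round_trip_cycles(10, 3.0, 0.65))  # ~308 cycles: 3 GHz over 10 m of fiber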
Chuck Kottke · 2 months ago
Well that was fascinating and encouraging! The future belongs to the efficient. 🔵
RustyBolts · 4 months ago
Thanks to this video, I can look forward to the day when we can upgrade our computers by drinking a pint of self-organizing chips, which will auto-integrate with our 'natural' neural networks and also hold our biological age at about 25 years young. "Life ain't fair: just when you're starting to understand things, you die." (Einstein)
Koi Yujo · 4 months ago
I'm very happy analog computers are coming back from the dead. They deserve it, since we need computers that work more like our brains.
Johnathan Brown · 4 months ago
As an electrical engineering PhD, you're giving me thesis ideas, lol.
Sorry I'm Late · 4 months ago
Imagine giving people in previous centuries a smartphone from today that is 100X faster than their giant machines...
Leul Shawel · 4 months ago
The robot was almost perfectly like a human child: curiosity, quick learning, and poor coordination. I can't wait till it grows up.
Guilherme sampaio de oliveira
Isn't there a possibility of offsetting the costs of these technologies by just making the SoCs much smaller?
Steve Louis · 4 months ago
Wouldn't a high-frequency vibration like ultrasound, applied while in suspension, help to align the nanotubes?
Mr. G-sez · 4 months ago
What about aligning the nanotubes with gravity? Would that be possible? Centrifugal, or electromagnetic and centrifugal?
Hui Li · 4 months ago
Yeah, it's an interesting puzzle.
Eden Lumbroso · 4 months ago
I really like the jumps between the founders and the skeptical guy.
Vignesh S · 4 months ago
Wonderful... great task... enormous knowledge... thank you all!
T J · 4 months ago
When people got the ability to make fuel-efficient cars... they made even larger ones, increasing fuel consumption. I'm confident the chip industry will be different 🤣