I'm a Systems Reliability and DevOps engineer for Netdata Inc. When not working, I enjoy studying linguistics and history, playing video games, and cooking all kinds of international cuisine.
I think the single biggest thing long-term is probably going to be working harder to improve energy efficiency of hardware.
Put simply, data processing requirements aren't going to go down any time soon, so we should be focused on improving the computational power per-watt of computers.
Luckily, things are already (slowly) moving in that direction for other reasons. ARM, if it ever becomes big in the HPC or desktop/laptop markets, will likely help a lot (it's got significantly more processing power per watt than most x86 chips do), and I suspect RISC-V will help too if it gets significant uptake. It would be nice if this movement were faster, but the fact that it is happening at all is still rather reassuring.
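To make the "power per watt" point concrete, here's a toy sketch of the metric itself. All numbers are hypothetical, chosen only to illustrate the comparison, not real chip specs:

```python
# Toy illustration of performance per watt (hypothetical numbers,
# not real benchmarks of any actual x86 or ARM part).

def perf_per_watt(gflops: float, watts: float) -> float:
    """Sustained GFLOPS delivered per watt of power draw."""
    return gflops / watts

# Hypothetical x86 desktop chip: 500 GFLOPS at 95 W.
x86 = perf_per_watt(500, 95)
# Hypothetical ARM chip: 300 GFLOPS at 15 W.
arm = perf_per_watt(300, 15)

print(f"x86: {x86:.1f} GFLOPS/W, ARM: {arm:.1f} GFLOPS/W")
```

In this made-up example the ARM part delivers less raw throughput but far more work per watt, which is the figure that matters for the energy-efficiency argument, not peak performance alone.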
Yes, definitely agree. Excellent point. Do you think the advancement in quantum computing can help?

It might. Right now, though, it's too soon to tell, because:

1. Quantum computing only improves performance for some things, not everything. You need actual quantum algorithms, and the problem has to be one that can actually benefit from the parallelism involved. I/O, for example, is never going to get faster because of quantum computing; there's simply nothing it can improve there (except possibly transparent compression or encryption, but that remains to be seen).

2. It's still rather inefficient. The absolute scale of quantum computing is too small to be useful on a practical level right now. If things keep improving, this will probably change eventually, but I doubt it will be soon.

Yes, right now it's not practical, but I hope it helps in the future, as that seems to be the only path forward: current technology innovation has reached its peak, and Moore's law already seems to be slowing down. google.com/amp/s/www.cnet.com/goog...
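As a concrete example of the point that quantum computing only helps problems with the right structure: Grover's algorithm for unstructured search needs roughly (π/4)·√N oracle queries, versus about N/2 expected lookups classically. The sketch below just compares those well-known query counts; the formulas are standard, but the specific N values are arbitrary:

```python
import math

# Rough query-count comparison for unstructured search. Grover's
# algorithm needs about (pi/4) * sqrt(N) oracle queries, versus
# roughly N/2 expected lookups for a classical linear scan.

def classical_queries(n: int) -> float:
    """Expected lookups to find one marked item in an unsorted list."""
    return n / 2

def grover_queries(n: int) -> float:
    """Approximate oracle queries for Grover's algorithm."""
    return (math.pi / 4) * math.sqrt(n)

for n in (10**6, 10**12):
    print(f"N={n}: classical ~{classical_queries(n):.0f}, "
          f"Grover ~{grover_queries(n):.0f}")
```

The gap grows with N, but only because search happens to admit a quantum algorithm; tasks like raw I/O have no analogous structure for a quantum computer to exploit, which is why the speedup doesn't generalize.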