1) Bigger chips.
2) Stacking layers (3D).
3) Stacking chips.
It could be argued, Ray, that that's technology 'evolving'. Not chips evolving but technology generally.
Google, Apple, Uber et al.
No, I didn't. From 1998 to 2007 smartphones and data already existed. The technology to do it economically is over THIRTY years old. But you either got billed per second for connect time OR paid about 1000x more than on current data plans. Until subsidised handsets and data plans came in, only some corporates did it.

Because you argued hypocritically. You said the smartphone was created due to one thing, and then you said it was created due to a different thing! Funny.
Moore's Law never was a law. Only Moore's target for Intel, revised 3 times. It's been dead for years.
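Whether it was a law or a target, the cadence amounts to simple compound doubling. A minimal sketch, in Python, of what that implies (the function name and the 3-year alternative period are my own illustration; the 2-year doubling period is the commonly cited 1975 revision):

```python
# Sketch: the compound growth implied by a doubling period — an observation
# and a target, not a physical law.

def projected_factor(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor after `years`, given one doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Over 20 years at a 2-year doubling, counts grow 1024x:
print(projected_factor(20))             # 1024.0
# Stretch the doubling period to 3 years and the same 20 years yields only ~100x —
# which is why small revisions to the cadence matter so much.
print(projected_factor(20, 3.0))
```

The point of the sketch is how sensitive the projection is to the assumed period: revising the target even slightly changes the long-run factor by an order of magnitude.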
Where is the new science to continue any rapid technological advancement?
- Genetic Engineering.
- Bulk bi-stable materials for replacing Flash memory.
- Graphene, maybe, for something ... there are problems with production.
- ....?
What else?
Considering the fact that we have done SO much over just the last 100 years, do you see technological advancements slowing down at all?
Could Self-Driving Cars Spell the End of Ownership?

I think that the future of transport is extremely uncertain, because of the rise of driverless cars. Uber is already making serious inroads into the taxi business; in the not too distant future, I can see taxi firms becoming essentially garages for fleets of driverless cars. Were I a youngish taxi driver, I would be looking for a new job (and/or training for one) right about now.
The same phenomenon might well be the death knell for private ownership of transport, at least in cities and big towns. It's already a pain in the posterior, and an expensive one at that, to keep a car in a major city. But taxis are expensive because someone has to drive the cab, and that someone wants to be paid and can't drive 24/7 even if he wanted to. But if one could whistle up a robocab from somewhere reasonably close by and let someone else worry about maintaining it and keeping it charged...
I also think that various city governments are going to fight hard against driverless cars. Why? Because parking fees and penalties (and penalties for traffic misdemeanours) are a major cash cow, that's why. And robocars will not break the law - at all, most likely. That won't be the reason given, of course.
One more thing: A major obstacle might be the legal one. Say that a pedestrian (or early on, a car with a driver) is hit by a robocar. Who is responsible? The driver? There isn't one; that's the whole point. The manufacturer? The company that wrote the software?
Seems to me that the question is not how quickly technology is evolving/advancing, but where these developments are taking place. While we continuously pour resources into cell phones, self-driving cars and other non-essential gadgets, we are funneling little or nothing into our long-term survival as a species.
Yep. That about sums it up. The trillions spent on commercial rubbish we don't need could be spent on our actual future as a species.
People have brought up various technologies, like driverless cars and simulated AI, to discuss. That was more in line with what I was going for. How fast do you think those technologies will evolve?
"Computers will evolve from binary to ternary."

No. On three levels! (ironically).
"Faster than light communication. Entangled pairs are one way that is theoretically possible now."

Actually, no!
Abandoned long ago, as were decimal (rather than binary) based numbers. Binary is the most efficient.
Inherently all (non-quantum) digital computers are based on switches. These can only be binary, on or off.
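That point — that classical digital logic bottoms out in two-state switches — can be sketched in a few lines: NAND alone is functionally complete, so NOT, AND, and OR can all be built from it. A minimal illustration (all names mine):

```python
# Minimal sketch: every classical logic gate reduces to two-state (on/off)
# values. NAND alone is functionally complete.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)          # NAND with itself inverts

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))    # invert a NAND to get AND

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))   # De Morgan: OR from inverted inputs

# Truth-table check for OR built purely from NANDs:
for a in (0, 1):
    for b in (0, 1):
        assert or_(a, b) == (a | b)
```

This is why "switches are either on or off" carries so much weight in the argument: one two-state primitive is enough to build all of classical digital logic.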
"But a continued development of binary will only result in computers that can execute more and larger instruction sets. They'll get faster and be able to do more things at one time, but we'll never reach AI like that. We must take computers and languages into another dimension (figuratively speaking)."

ARM is replacing Intel x86, with simpler and smaller instructions.
"Abandoned in the early stages of digital programming because binary was by far easier."

It was a LATE idea, quickly abandoned. All the first digital computers used binary, i.e. on/off switches made from relays or valves (tubes), from the Z1 in 1938. Later, transistors replaced valves.
"Ternary computers operate with switches just like binary, but instead of On and Off, they have -1, 0, and +1."

Zero computational advantage, and a disadvantage: Boolean algebra needs binary.
"Ternary computers operate with switches just like binary, but instead of On and Off, they have -1, 0, and +1."

No, because such structures are inefficient and switches are either on or off. It's just technobabble, nonsense.
"The end result is a computer that can make a decision of sorts, or at least compute dimensional variables. Binary programming, as good as it is, is purely linear. Right now, 56 years after the Setun ternary computer was built in the USSR, several teams are working out the necessary logic gates for modern ternary computers. I'm certainly not the only one who thinks that it's the next logical progression."

Absolutely no serious hardware designer or serious computer scientist espouses ternary. Ternary is no help at all for AI.
Where would we be today if we had always accepted existing technology as the best and only possible?

"Abandoned in the early stages of digital programming because binary was by far easier."

Yes, I agree. Development teams were just trying to design operational computers. Why make it more complicated than it already was? But a continued development of binary will only result in computers that can execute more and larger instruction sets. They'll get faster and be able to do more things at one time, but we'll never reach AI like that. We must take computers and languages into another dimension (figuratively speaking).
Ternary computers operate with switches just like binary, but instead of On and Off, they have -1, 0, and +1. The end result is a computer that can make a decision of sorts, or at least compute dimensional variables. Binary programming, as good as it is, is purely linear. Right now, 56 years after the Setun ternary computer was built in the USSR, several teams are working out the necessary logic gates for modern ternary computers. I'm certainly not the only one who thinks that it's the next logical progression.
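For readers unfamiliar with the -1/0/+1 scheme: it is known as balanced ternary, and it is the representation the Setun used. A minimal, illustrative Python sketch of encoding integers into it (the digit glyphs and function name are my own choices):

```python
# Sketch of balanced ternary, the -1/0/+1 digit scheme used by the Setun.
# Digits are written here as '-', '0', '+' for -1, 0, +1.

DIGITS = {-1: '-', 0: '0', 1: '+'}

def to_balanced_ternary(n: int) -> str:
    """Encode an integer in balanced ternary; no sign bit is needed."""
    if n == 0:
        return '0'
    out = []
    while n != 0:
        r = n % 3           # remainder 0, 1, or 2
        if r == 2:          # digit 2 becomes -1 with a carry into the next trit
            r = -1
            n += 1
        out.append(DIGITS[r])
        n //= 3
    return ''.join(reversed(out))

print(to_balanced_ternary(8))    # '+0-'  (i.e. 9 - 1)
print(to_balanced_ternary(-8))   # '-0+'
```

A handy property the sketch shows: negating a number just flips every trit, so balanced ternary needs no separate sign bit — one of the elegances its advocates point to.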
I don't understand what error correction has to do with anything. Developers will certainly employ it in their ternary OSs. I also still think that the end result will be something of a hybrid, a ternary central processor and ternary memory coupled to binary data nodes. The data will be translated into ternary when needed.
"Where would we be today if we had always accepted existing technology as the best and only possible?"

I don't think anyone involved in engineering in the last 8,000 years has thought that.