Or, `The more things change, the more they stay the same`. A long, long time ago computers were big. They took up entire rooms and required cadres of experts who would care for them, take your punch cards, and then some time later tell you that you punched the wrong hole on card 97. So you'd redo card 97, resubmit your cards, and get the next error. Rinse and repeat until lo, an answer appeared.
Then computers got smaller. You could put two or three in a room. Someone added a keyboard and a roll of paper towels, and you could type at it. And get an answer. And be miles away, connected by a phone line. Time and Moore's law marched on, and computers got smaller and smaller. Teletypes turned into VT100 terminals, then vector displays, and finally bitmap displays. Thus was born the Mother of All Demos. And it was good. Altair, then Sinclair, Atari, Commodore, and IBM started making personal computers. They got even smaller, and Osborne gave us suitcase-sized luggables. IBM and Apple made them easier to use. Then the Macintosh made them "cool". And all the power was on the desktop. And you could run what you wanted, when you wanted to.
Then Sun came along, and the network was the computer. At least for the enterprise. Centralized services and data, with lots of thin clients on desktops. Solaris. SPARC. X Windows. Display PostScript. Control. The cadre in the datacenter told you what you could run and when. They controlled access to everything.
But what about Microsoft? A computer on every desktop (running Microsoft software). And it became a thing. NAS, SAN, and Samba. File servers were around to share data, but they were just storage. Processing moved back out to the edge. All the pixels and FPS you could ask for. One computer had more memory than the entire world did 20 years earlier. We got Doom, Quake, and MS Flight Simulator. But all those computers were pretty isolated. LAN parties required rubber chickens and special incantations to get 4 computers in the same room to talk to each other.
Meanwhile, over in the corner, DARPA and BBN had built MILNET, universities joined BITNET, and computers started talking to each other, almost reliably. Email, Usenet, and maybe, distributed computing. TCP/IP for reliable routing. Retries. Store and forward. EasySabre. CompuServe. Geocities. Angelfire. AOL, and the September that never ended. The internet was a thing, and the network was the computer again.
And now? The speed of light is a limit. You need things close by. Akamai isn't enough. Just having the data local doesn't cut it when you need to crunch it. And the big new thing is born: edge computing. Little pockets of local data that do what you need, and occasionally share data back with the home office and get updates. Hybrid cloud and on-prem systems. It's new. It's cool. It's now.
It's the same thing we used to do, writ large. Keep your hot data close to where it's going to be processed. It doesn't matter if it's RAM vs. drum memory, L1 cache on chip vs. SSD, or SAN vs. cloud. Or I could tell you the same story about graphics cards: bandwidth limited, geometry-transform limited, fill limited, power limited. Or disk drives: RPM, density, number of tracks, bandwidth, total storage. Or display systems: raster, vector, refresh rate, pixel density. Or, for a car analogy, internal combustion engines.
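The "keep hot data close" principle shows up even in a few lines of code. Here's a toy sketch (in Python, so interpreter overhead muddies the picture; on real hardware with compact arrays the cache effect is far more dramatic) that walks the same matrix two ways: along the contiguous rows, and jumping column-first across rows. The sizes and timings are illustrative assumptions, not benchmarks.

```python
# Toy illustration of data locality: same work, different access patterns.
import time

N = 1000
matrix = [[1] * N for _ in range(N)]

def row_major(m):
    # Walks each inner list front to back: the "hot" data stays close.
    total = 0
    for row in m:
        for x in row:
            total += x
    return total

def col_major(m):
    # Jumps to a different row on every step: poor locality (and, in
    # CPython, extra indexing work on top of it).
    total = 0
    for j in range(N):
        for i in range(N):
            total += m[i][j]
    return total

for fn in (row_major, col_major):
    start = time.perf_counter()
    result = fn(matrix)
    elapsed = time.perf_counter() - start
    print(f"{fn.__name__}: sum={result}, {elapsed:.3f}s")
```

Both loops compute the same sum; only the order of memory access differs. That gap, scaled up, is the whole story above: drum vs. RAM, SSD vs. L1, cloud vs. edge.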
In 1968 Douglas Engelbart showed us the future. It took 50+ years and some large number of cycles to get here, and we're still cycling. There are plenty of good lessons to learn from those cycles, so let's not forget about them while we build the future.