IT Spot

What Once Was State-of-the-Art Is Now Obsolete

There was a time when my high-end workstation was the pinnacle of technology, but today, it’s little more than a paperweight. I recently had to pay just to dispose of a computer that’s over a decade old. It’s frustrating to see how quickly technology advances, rendering once cutting-edge devices obsolete. These days, I’ve stopped chasing the latest upgrades. I only consider upgrading my workstations or computers when I genuinely need the extra power and speed, such as for Bitcoin mining or video rendering. The pace of hardware development seems to have finally outstripped the demand for speed in areas like video editing, computational tasks, and gaming. Still, I can’t deny the thrill I get from benchmarking a new computer. But for now, my budget is the only limit, and here’s a rundown of my current setup as of August 2024.

Home Server

Believe it or not, I’m still running a Xeon-powered Windows Server to manage files. With so many cloud-based programs and storage solutions available, the need for a large NAS unit has become almost archaic. Who still backs up important data on thumb drives, optical media, or hard drives? I’ve been running hard drives 24/7 for over 50 years, and if there’s any cosmic justice, I might pay for my high carbon footprint in the afterlife. There was a time when I hosted my own mail server and even ran my website from my garage, but those days are long gone.

Editing and Home Computers

I’ve got a mix of current Intel i7 and Ryzen-powered PCs, primarily for gaming. My son is into Steam games and always craves the speed of a good graphics card, while at 68 I mainly play Meta Quest 3 VR games and spend time on my golf simulator. These days, I’m more interested in making money with NVDA stock than in getting excited about its GeForce RTX 4090; gaming is plenty fast for me now.

Speaking of the golf simulator, it’s built around the Garmin R10 Launch Monitor, a relatively affordable option. After testing it on the range, I was quite satisfied with its performance. Over the past year, third-party software companies have started supporting it for full-size projection golf simulation. For this setup, I use an Epson projector with an old Apple TV attached. The projector is configured to AirPlay from my phone or iPad, making it easy to switch from handheld to full projected golf simulation with the Awesome Golf app.

Video Editing Setup

For digital video editing, I upgraded from an iMac Retina Pro i7 to an M1 Studio. The M1 Studio makes editing in Final Cut a breeze, especially across three 27-inch high-resolution screens. These days, I’m more into editing 360-degree camera footage from the Insta360 X3 and X4 rather than using my GoPro. Although I still own several DSLR video cameras, I’m content with 4K editing and don’t anticipate upgrading to 8K workflows in my lifetime—though I’ve learned never to say never. My current 4K workflows are fast and more than adequate on my Studio and MacBook Pro M1 laptops. I’m aware of the improved rendering times with the M3 chips, but for now, my current setup works just fine. Instead of upgrading, I’ll continue investing in NVDA stock and revisit the upgrade question in a few years.

The Rise of Skynet and How Battlestar Galactica Survived the Cylons

In the fictional saga of Battlestar Galactica, written in 1978, the survival of the Galactica during the Cylon invasion is a testament to the advantages of older, analog systems in a world increasingly dependent on digital technology. Unlike the rest of the fleet, Galactica’s outdated systems were not networked, making them immune to the Cylons’ sophisticated cyberattacks. This allowed the ship to evade destruction and become a sanctuary for the remaining human survivors.

This scenario mirrors the modern-day reliance on cloud services, which, while offering convenience and scalability, are heavily dependent on the speed and security of internet connections. Cloud infrastructure, though robust, is not invulnerable to cyberattacks, data breaches, or connectivity issues. The reliance on these systems exposes data to potential risks, much like the networked ships in the Battlestar Galactica universe were vulnerable to the Cylons.

In contrast, maintaining important data on local hard drives offers a level of security that cloud services cannot match. By keeping data offline, it is shielded from potential online threats, ensuring that it remains safe from hacking, ransomware, and other cyber threats. This approach also provides control over data access and reduces dependency on external networks, offering a reliable fallback in the event of internet failures or security breaches.
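The offline-copy approach above can be sketched in a few lines of Python. This is a minimal, illustrative example (the directory paths are hypothetical, not my actual server layout): it copies files to a local backup drive and records a SHA-256 checksum for each, so that later re-hashing reveals whether anything has been silently altered or encrypted.

```python
import hashlib
import shutil
from pathlib import Path

def backup_with_checksums(src_dir: str, dest_dir: str) -> dict:
    """Copy every file under src_dir to dest_dir and record a SHA-256
    checksum per file, so later tampering (e.g. ransomware encryption)
    can be detected by re-hashing and comparing."""
    src, dest = Path(src_dir), Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    checksums = {}
    for f in src.rglob("*"):
        if f.is_file():
            rel = f.relative_to(src)
            target = dest / rel
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            checksums[str(rel)] = hashlib.sha256(f.read_bytes()).hexdigest()
    return checksums

def verify_backup(dest_dir: str, checksums: dict) -> list:
    """Return the files whose current hash no longer matches the record."""
    dest = Path(dest_dir)
    return [rel for rel, digest in checksums.items()
            if hashlib.sha256((dest / rel).read_bytes()).hexdigest() != digest]
```

Run the backup, unplug the drive, and the data is out of reach of anything on the network; plugging it back in and running the verify step tells you whether the copy is still pristine.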

The Old Tech Still Stands

New tech isn’t always better. After half a century practicing dentistry, I’ve seen what lasts and what doesn’t. Take the silver fillings I placed in my parents’ teeth at the start of my career—still going strong in my 95-year-old mother, who isn’t suffering from Mad Hatter’s disease. Those simple, cheap fillings have outlived the fancy composites that were supposed to replace them. Better? I think not. Yet, they don’t even teach silver fillings in dental school anymore.

The same holds true for technology. Software today is so bloated and monitored that it’s become inefficient. A simple computer put a man on the moon, but now our spacecraft are so complex that no single person can understand or monitor them fully. Windows 1.0 had around 200,000 lines of code when it launched in 1985. Today, Windows 11 has 50 million lines—code bloat. Now, only AI can wrap its mind around it. Even Jensen Huang, the CEO of NVIDIA, admitted that future chips won’t be designed by humans but by AI.

The old sci-fi writers got it right. The real threat from AI isn’t that it’ll go rogue and wipe out biological life. The danger lies in code bloat. It comes when an AI with a critical mission makes a mistake, or when humans find ways to exploit its loopholes.

I see the future of personal computing shifting from powerful workstations to data centers. And I should know—I was part of the team that wrote Winamp during the Napster years, back when half the world’s bandwidth was used to pirate music that only played on Winamp. If you want something done now, you talk to a data center with a simple, low-tech interface. AI isn’t a bubble, no matter what the skeptics say.

Even ransomware will soon be history. Combining real-time data redundancy in the cloud with blockchain technology creates a strong defense against these attacks. With constant backups in multiple locations, compromised files can be restored quickly, reducing downtime and data loss. Blockchain adds an extra layer of security, creating an immutable record of all transactions and data changes. This transparency makes it nearly impossible for attackers to alter or encrypt files without being detected. Together, these technologies could make ransomware ineffective, ensuring that data stays secure and recoverable.
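The tamper-evident record described above can be illustrated with a toy hash chain. This is a sketch of the underlying idea, not a real blockchain: each ledger entry stores a file’s hash plus the hash of the previous entry, so altering any past entry breaks every link after it, and the change is detected. The class and method names here are my own invention for illustration.

```python
import hashlib
import json

class BackupLedger:
    """A minimal hash chain: each entry commits to the one before it,
    which is the property blockchains use to make records tamper-evident.
    Illustrative sketch only, not production code."""

    def __init__(self):
        self.entries = []

    def record(self, filename: str, file_hash: str) -> None:
        """Append an entry linking this file's hash to the chain so far."""
        prev = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        body = {"file": filename, "hash": file_hash, "prev": prev}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "entry_hash": digest})

    def verify(self) -> bool:
        """Recompute every link; any edited entry breaks the chain."""
        prev = "0" * 64
        for e in self.entries:
            body = {"file": e["file"], "hash": e["hash"], "prev": e["prev"]}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or digest != e["entry_hash"]:
                return False
            prev = e["entry_hash"]
        return True
```

An attacker who encrypts a file and tries to rewrite its ledger entry would have to rewrite every subsequent entry too, which is exactly what a distributed, replicated ledger makes impractical.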

I’ll make a bold prediction: the future is all about AI-powered augmented reality. Imagine having conversations with someone who doesn’t judge presidential candidates by their hair color or personality—something that seems to sway nearly half the population. I want an augmented reality companion like Commander Data from Star Trek or Cortana 2.0 from Halo. Has anyone noticed how ChatGPT-4 is less biased? It’s a pretty good fact-checker and can be reasoned with if you present evidence. I have high hopes for my grandchildren.