English Amiga Board


Old 16 August 2011, 18:42   #1
lifeschool
Local Moderator
 
lifeschool's Avatar
 
Join Date: Oct 2009
Location: Lancashire, UK
Age: 48
Posts: 1,591
INTERVIEW WITH DAVE HAYNIE - Insights On Future Technology!!

Greetings everyone,

You are welcome here today to bear witness to a brand new and exclusive interview with a hero of the Amiga scene - the legendary Dave Haynie. For those who don't know, Dave started on the Commodore 128, and continued through to the much-heralded 'AAA' chipset - the successor to AGA. I took the opportunity to interview him.



> Hello Dave. I recently saw your 2005/2006 CommVEX/User Group videos on YouTube - in which you are sadly underused as an interviewee (guess you noticed). Any thoughts on that?

Yeah... that was kind of weird. They actually paid for my trip out there, and then it was basically up to me to just hang around. I even broke out a guitar and played for them while they were taking the show down.


> So, before we gaze longingly into the future, let's begin in the past. Whatever happened to the idea of the Amiga Set-Top Box?

It's kind of happened. This goes way, way back to Commodore... of course. We had brainstorming sessions back in the C128 days, and I had come up with the idea of a "living-room computer"... something that would integrate into your component stereo system. This was, of course, before video was a typical component of such systems, but at the time, I had one of the first televisions designed to do just that (from Panasonic's Technics brand).

So the idea for the STB I had in the late 1990s was effectively to put a personal computer in your A/V system, but just not call it a personal computer. Our early Metabox 500 was sort of a rough cut, but the never-released Metabox 1000 was the real thing. This ran our own Amiga-like OS, CAOS (Carsten and Andy's Operating System), complete with MUI, Voyager, etc. The graphics chip I used did hardware picture-in-picture, so you could watch a DVD or DVB broadcast (and eventually, internet video) in a window or two without the need for hardware scaling. It even played MP3s.


> So how do you decide which multimedia formats a system will use?

Back in the late 1990s, it wasn't entirely clear which media formats would dominate, and there was a break between consumer electronics and the internet. Today, you could cover nearly everything with the existing DVD/Blu-Ray formats, which you have to have anyway. Going a little deeper, since any STB system is going to be largely software driven, it's likely that important new formats, like maybe the WebM or Ogg stuff, are just "a simple matter of software".

Hardware overkill is why products often fail.. you don't need every interface built-in on a device, or it'll be too expensive for anyone to ever buy. No one who would buy such a product cares about video over RF anymore, other than maybe television reception (ATSC/DVB). No one's using Firewire anymore. LAN and USB, those are must-haves. SATA maybe internally, but there's no need for that to be external... USB 2.0 is fast enough for video, and these days, I'd go with USB 3.0 and just be done with it.
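
(Some rough numbers to back that up - mine, not Dave's: USB 2.0 signals at 480 Mb/s and delivers maybe 280-320 Mb/s in practice, while even Blu-ray video tops out around 40 Mb/s and DVD at roughly 10 Mb/s, so there's an order of magnitude of headroom for moving video over USB 2.0.)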

Any lesser interface out there probably has a bridge device already. For example, there are video capture devices that go from analog to USB, plenty of them. For the rare user who needs this, find one or two and ensure your STB works with them.

NO ONE is going to use such a device for professional video work... not even people like me who use their PCs for professional video work. I absolutely want my STB to be able to preview completed videos in many ways (eg, from memory card, over the net, etc) but I have no use for editing from them. And if I were starting on this today, I'd be entirely happy with memory card (perhaps even just SD/SDHC/SDXC) and USB for video and photo capture, since that's the overwhelming standard today in consumer and even many professional products.


> What about video standards and other output 'compatibility' issues?

SVGA is already a dinosaur. New systems offer only DVI, HDMI, DisplayPort, maybe Thunderbolt if it actually catches on beyond Apple. A full DVI port includes analog outputs, but that's kind of going away. It's already the case that analog inputs on a computer monitor are starting to vanish (ok, my two 1200p monitors actually have CVBS, Y/C, YPrPb, VGA, and HDMI inputs... but it'll be a challenge to find a monitor with any analog input other than VGA, and many are leaving that off).

You have to be careful about getting too crazy. If I charge every user for the 0.001% of people who want to hook up a turntable, I have lost already. On the other hand, if I build in software that works with, say, all Numark USB turntables and the cheap Behringer turntable-to-USB audio interface (well, hey, I have both of these on my desk right now), then I've handled the problem well enough to offer a practical solution, for perhaps a day's work on software drivers (if that).


> Was it difficult for the Metabox to handle very good quality streaming video in those days using the existing hardware?

Not so much. Hardware has been growing faster than networking. Back when Metabox USA was in talks with Blockbuster and Enron (in 2000) to deliver the STB piece for their vision of "a Blockbuster store in every home", we were talking about 1Mb/s video, with lots of local buffering (eg, you needed a hard drive) to deliver more-or-less-DVD quality video in MPEG-4 ASP, not even AVC.
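
(Rough arithmetic, mine rather than Dave's: at 1 Mb/s, a two-hour film comes to about 1 Mb/s x 7,200 s = 7,200 Mb, roughly 900 MB - small enough to trickle onto a local hard drive overnight even over a line that couldn't sustain real-time streaming.)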

Today, of course, DVD quality isn't difficult, but no one's delivering streamed Blu-ray quality yet. Of course, with the Apple-fication of media, there are lots of people who'll take something of lesser quality, now, versus having to wait for a day or go to a store to see it better.


> With TV station viewing figures continually on the decline, by 2002 you had the original idea of bringing the power of network streaming technology to the home user. Tell us about Fortele…

Springsteen wrote a song about "57 Channels (and nothing on)".. and sure, you can extend that to 500 channels on some days today. After Metabox, Andy [Finkel] and I had a startup called Fortele. Our goal was whole home media networking, where the big thing was the network. You'd hook devices into the network and they'd work like PC peripherals -- entirely subsumed by the network interface. Every device would have the same UI via our interactive system. You could watch or record any source of video on any video player in any room in the house, even take it with you (with enough warning, anyway).

DRM was probably the largest issue to deal with there... the DVD and satellite/cable providers didn't love the idea of ripping their resources and spreading them around the house. This ironically made this idea easier to eventually build with free tools on your own (all those folks supporting some kind of actual PC for the livingroom) than to commercialize.

We never got that far... it was bad business people that ultimately killed that project (bad at getting money.. it sometimes seems like you either get honest money people who are no good at their job, or bad-asses who bring in the cash but are as likely as not to stab you in the back if they've got nothing better to do).



> That brings us up to today. In your view, which is better - Mac or PC? (or emulate both?)

Macs are PCs.. it's all just about the OS these days. And few people are worried about both -- why does some OS I'm not using have anything important to say on how I write a program for the OS I am using?


> Why, despite 64-bit speed-stepping six-core CPUs, are modern computers still painfully slow to boot and operate?

There's two things at work here. They really are very fast. But that speed is always being used. Look at the work an Amiga had to do to boot... most of the code was already resident, it booted, not instantly, but in seconds, because it just didn't do that much before starting. Today's PCs have hard drives 5x faster than the fastest Amiga RAM, but they're doing 1000's of extra things before they boot. This could be changed, but there's no great incentive to do so, because you just don't reboot that often.

If you look at either Windows or Linux, they're doing lots of things at boot time, loading all kinds of services... there's a reason why an average week's updates for either are many, many times larger than the whole AmigaOS. To speed things up here, they could do what we did in later versions of AmigaOS -- launch the critical components immediately, then spawn off a background process to complete the other stuff.

That would work, but it would also cause problems in current-gen OSs. For one, applications aren't written to launch with a system not fully formed. If you took some of the idea of an OS like Android, in which every application has a resource manifest, and applied that to system resources as well, then this would all kind of "just work".... apps would launch, but know to sleep until the things they needed were fully available.
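
(To make that idea concrete, here is a minimal sketch - my own illustration in C with POSIX threads, not Dave's code and not how any shipping OS actually boots - of bringing up the critical path first, finishing the rest in a background thread, and letting an application sleep until the resources in its hypothetical "manifest" are ready:)

[code]
/* Sketch only: staged boot with a background init thread and manifest-style waiting. */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

enum { RES_DISK = 1, RES_NETWORK = 2, RES_AUDIO = 4 };   /* manifest bits */

static int ready_mask = 0;                          /* which resources are up */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  cond = PTHREAD_COND_INITIALIZER;

static void mark_ready(int res)                     /* a service finished its init */
{
    pthread_mutex_lock(&lock);
    ready_mask |= res;
    pthread_cond_broadcast(&cond);                  /* wake anyone waiting on it */
    pthread_mutex_unlock(&lock);
}

static void *background_init(void *arg)             /* the non-critical boot work */
{
    (void)arg;
    sleep(1); mark_ready(RES_NETWORK);              /* stand-ins for slow bring-up */
    sleep(1); mark_ready(RES_AUDIO);
    return NULL;
}

static void wait_for(int needed)                    /* app sleeps until its manifest is met */
{
    pthread_mutex_lock(&lock);
    while ((ready_mask & needed) != needed)
        pthread_cond_wait(&cond, &lock);
    pthread_mutex_unlock(&lock);
}

int main(void)
{
    pthread_t bg;

    mark_ready(RES_DISK);                           /* critical path first... */
    pthread_create(&bg, NULL, background_init, NULL);   /* ...defer the rest */
    printf("desktop/shell can start now\n");        /* user sees something immediately */

    wait_for(RES_NETWORK | RES_AUDIO);              /* a media app's manifest */
    printf("media app: everything I declared is ready\n");

    pthread_join(bg, NULL);
    return 0;
}
[/code]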

As for slowdowns in interactivity, that's largely due to poor system design. Keep in mind, for example, that Windows was designed for single-tasking systems. There are all kinds of things in Windows APIs that are designed to serialize things that should happen in parallel. In other OSs (used to be true of MacOS, not sure now), only one of N CPU cores can be the I/O processor -- only that one can handle hardware interrupts.

In short.. if you have slowdowns on the desktop, it's the software. And some of that's buried deep in the OS architecture... if we were all running AmigaOS or BeOS or something like it today, you wouldn't see much of it (though actually, there are some problems that have been exposed in the AmigaOS model that really start to drag a system down, too.. this is why software needs to evolve to match the hardware).

PCs evolve as fast as they can. And honestly, unless you're doing games or video or a few other very intensive tasks (I needed a memory upgrade to 16GB to really be happy with some photo editing I'm doing lately...but when you're merging 30-something 18Mpixel photos in 48-bit color, this happens), PCs have been plenty fast for the average user for quite some time.


> Would it not be faster just to combine the CPU and Ram (or HDD) into one unit?

That's not even remotely a near-term solution. But of course, systems have been evolving to manage delay, which is the real solution. As transistors get cheaper, you have more on-chip cache, more processors, etc. to ensure that, more of the time, you're not waiting on some hardware resource... or your wait is pushed aside to let other things do their work.


> So, perhaps it’s not the speed of the hardware but the speed of the OS which matters most?

You could come up with all kinds of sophisticated ways to speed an OS up, and they might do great -- if you already had an efficient OS. But the problems these days are more large scale. I mean, look at Windows design.

Like Intuition interfaces on the Amiga, every Windows application has a mandatory (unless it's a shell program) Windows message port. Windows uses this for damn near everything.. even signals sent asynchronously from one thread to another can, if you're not careful, get turned into messages on the message port. The message queue, too, is a serial thing.. you have to clear one message to get to the next one.

Now, in Intuition, we have some pretty good power.. an application registered with Intuition, but only for the messages it needed to see. Others were handled entirely by Intuition, and the message queue you did get was just for your application. Intuition could do a bunch of things, like handling window resizing, gadgets, etc.
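
(For readers who never wrote to it, here's a bare-bones sketch of that model - my own from-memory illustration, error handling omitted, not code from the interview: the window registers with Intuition for only the two IDCMP message classes it cares about, and Intuition deals with everything else itself:)

[code]
/* Classic-AmigaOS sketch, from memory; the point is the narrow IDCMP registration. */
#include <proto/exec.h>
#include <proto/intuition.h>
#include <intuition/intuition.h>
#include <utility/tagitem.h>

int main(void)
{
    struct Window *win;
    struct IntuiMessage *msg;
    ULONG cls;
    BOOL running = TRUE;

    win = OpenWindowTags(NULL,
        WA_Title,         (ULONG)"IDCMP sketch",
        WA_Width,         320,
        WA_Height,        200,
        WA_DragBar,       TRUE,
        WA_CloseGadget,   TRUE,
        WA_SimpleRefresh, TRUE,
        WA_IDCMP,         IDCMP_CLOSEWINDOW | IDCMP_REFRESHWINDOW,  /* only these */
        TAG_END);
    if (!win) return 20;

    while (running) {
        Wait(1UL << win->UserPort->mp_SigBit);          /* sleep until a message arrives */
        while ((msg = (struct IntuiMessage *)GetMsg(win->UserPort)) != NULL) {
            cls = msg->Class;
            ReplyMsg((struct Message *)msg);            /* hand the message back quickly */
            switch (cls) {
            case IDCMP_REFRESHWINDOW:                   /* OS says: redraw your damage */
                BeginRefresh(win);
                /* ...redraw window contents here... */
                EndRefresh(win, TRUE);
                break;
            case IDCMP_CLOSEWINDOW:
                running = FALSE;
                break;
            }
        }
    }
    CloseWindow(win);
    return 0;
}
[/code]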

Windows isn't that sophisticated. For one, every window in Windows is what Amiga calls a SIMPLE_REFRESH window... when something changes in the window manager, you get a message from the OS that says "redraw your stuff". Nothing else... no OS management of the window contents. This is why, when a Windows system gets busy, windows don't erase, much less redraw, and even the gadgets don't get refreshed.

Now consider that, until XP or Vista (don't recall offhand), there was one single global message queue -- another aspect of back when Windows was single-tasking. So application A gets a redraw message. A well behaved application A would acknowledge that message, grab any others it might have, then signal a thread (without using the message queue again... but this isn't always an easy thing to do) to go and redraw the Window. Of course, a poorly designed application will use just the one thread, and hold up the whole message queue -- every other application on-screen -- while it does its work.
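
(And a matching sketch of the Windows side - again my own heavily simplified illustration, not from any real application: everything for this thread's windows funnels through one serial GetMessage loop, so a well-behaved program acknowledges WM_PAINT quickly and signals a separate worker thread to do the heavy redraw rather than stalling every message queued behind it:)

[code]
/* Sketch only: one serial message pump, heavy redraw handed to a worker thread. */
#include <windows.h>

static HANDLE redraw_event;                        /* GUI thread -> worker signal */

static DWORD WINAPI redraw_worker(LPVOID arg)
{
    (void)arg;
    for (;;) {
        WaitForSingleObject(redraw_event, INFINITE);   /* sleep until signalled */
        /* ...expensive rendering into an off-screen buffer would go here,
           with the finished result blitted to the window afterwards... */
    }
}

static LRESULT CALLBACK WndProc(HWND h, UINT m, WPARAM w, LPARAM l)
{
    switch (m) {
    case WM_PAINT: {
        PAINTSTRUCT ps;
        BeginPaint(h, &ps);                        /* acknowledge the damage... */
        EndPaint(h, &ps);
        SetEvent(redraw_event);                    /* ...and hand the real work off */
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(h, m, w, l);              /* default handling for the rest */
}

int WINAPI WinMain(HINSTANCE inst, HINSTANCE prev, LPSTR cmd, int show)
{
    WNDCLASS wc = {0};
    MSG msg;

    redraw_event     = CreateEvent(NULL, FALSE, FALSE, NULL);   /* auto-reset */
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = inst;
    wc.lpszClassName = TEXT("PumpSketch");
    RegisterClass(&wc);
    CreateWindow(TEXT("PumpSketch"), TEXT("sketch"),
                 WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                 CW_USEDEFAULT, CW_USEDEFAULT, 320, 200,
                 NULL, NULL, inst, NULL);
    CreateThread(NULL, 0, redraw_worker, NULL, 0, NULL);

    while (GetMessage(&msg, NULL, 0, 0) > 0) {     /* one message at a time, in order */
        TranslateMessage(&msg);
        DispatchMessage(&msg);                     /* blocks for as long as WndProc takes */
    }
    return 0;
}
[/code]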

And Windows isn't the only culprit here... all current OSs have these kinds of issues. Even AmigaOS can get really slow over issues like layer locking (there's a video up on YouTube which shows MorphOS, essentially cloning the AmigaOS, failing against MacOS on opening multiple web browser instances, due to some of this stuff). There is a ton of room for optimization.

One problem is the very fact that OSs are not a big deal anymore. Most of the time, they do what they need to do, and otherwise get out of the way. And the actual value of a commercial OS as a product is close to zero... that's one big thing that did Be, Inc. in. So while there have been advances in OS technology, no one's really looking at efficiency issues, particularly on the desktop. After all, you know very well that next year's computer will be twice as fast (which actually doesn't help against these sorts of software state-machine inefficiencies and deadlocks one bit. I also expect that the few people on the earth who understand these issues well are not working for OS companies).


> Some people think the OS should be the emblem and selling point of the machine - others consider the OS nothing more than a background launch-pad, there to launch applications as quickly as possible. I prefer the second option…

That's pretty much the role of OSs today... no one's really selling a machine on the OS other than Apple these days. And there are good historical reasons why.

Back in the 8-bit days, when you bought a computer, the OS would be something you used directly and every day, as a programmer. In the early days, no one bought a computer who didn't write at least some code for it. So you had to be concerned about the machine, you wrote directly to the hardware, etc... and you could actually learn the whole thing. In short, your hobby (which was 90%+ the point of personal computers in those days) was the computer itself. This is why so many people still mess around with the C64.

As computers evolved, new people started using them... and they didn't care much about the OS, they just needed to know their applications would run. That's even more true today, with the additional fact that many of your applications can be web-based, so in fact the OS doesn't even matter sometimes for the old application-lock-in advantage (eg, the thing that's kept Windows alive all these years). I can surf the net, watch videos, etc. as well on my Android tablet as my desktop PC.. but the tablet moves with me, and even works well outside.

All of these things are making the OS less important. It's still an issue for application compatibility -- I don't care so much about running Windows, but I want Sony Vegas and Cakewalk Sonar and Altium Designer to run on my desktop.... for that kind of work.



> Looking more towards the future… Whatever happened to the promise of the Next Generation computer? We moved from 8 bit to 16, 16 to 32, 32 to 64. So when’s the Light-Speed computer coming out?

The next-generation computer is largely a myth... because of several factors. One is simply that computers today are very, very complex. Thus, they're broken down into modular components... exactly the kind of thing I was trying to do in the latter days of Commodore. One reason Amigas fell behind on hardware was that the whole system had to be re-invented for each new machine. The PC succeeded in splitting things up nicely. And many of these components, like system chips, CPUs, GPUs, etc. have become such a big deal, one company can only really do one or two of these well. And they're not starting fresh, either, but standing on the shoulders of last year's work.


> Would the Next Gen. computer have super-fast RAM, which would store data like flash RAM?

There is actually technology moving in that direction... magnetic RAM. This is known as MRAM or FeRAM (two different approaches), and this promises to essentially replace both flash (because it's non-volatile, unlike RAM) and DRAM (because it's very fast, unlike flash) with a single memory type. This has been in development for probably two decades, but it's starting to show up in places. Texas Instruments, for example, has a version of the MSP430 microcontroller (the chip I used in the Nomadio Sensor and receivers) with MRAM replacing flash and RAM.

Basically, you could turn your computer off, and when you turn it back on, it would resume pretty much right where it left off. The one issue, of course, is that not every device in the system is non-volatile. So drivers still have to understand the power-down state, and re-initialize the hardware they manage.
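
(A conceptual sketch of what that driver contract might look like - purely hypothetical and my own, not any real OS interface: with non-volatile main memory, everything in RAM survives the power cycle, so "resuming" reduces to walking the driver list and letting each driver re-program the volatile registers of the hardware it manages:)

[code]
/* Hypothetical sketch: RAM state survives power-off, only the hardware forgets. */
#include <stdio.h>

struct driver {
    const char *name;
    void (*reinit_hw)(void);        /* restore register state lost at power-off */
};

static void uart_reinit(void)  { puts("uart: restore baud rate, re-enable IRQs"); }
static void video_reinit(void) { puts("video: reload mode timings and palette"); }

static const struct driver drivers[] = {
    { "uart",  uart_reinit  },
    { "video", video_reinit },
};

static void resume_from_poweroff(void)
{
    for (size_t i = 0; i < sizeof drivers / sizeof drivers[0]; i++)
        drivers[i].reinit_hw();     /* processes and caches are already intact in RAM */
}

int main(void)
{
    resume_from_poweroff();
    puts("picking up exactly where we left off");
    return 0;
}
[/code]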


> Would the Next Gen. computer have 1TB internal flash ram/rom?

1TB flash ROM today would cost thousands... there is absolutely no need for that much flash on a consumer device. Sony got it right with the PS3... ship a fairly useful-sized HDD, but make it super easy to replace it with a larger one. Thus, my PS3 has a 320GB HDD... I can get something close to 1TB in the 2.5" form factor today, or even an SSD if I really want that (it offers no practical advantage other than running a little quieter... nothing on such a device comes close to stressing the performance of an HDD).


> Would the Next Gen. computer be able to offer video editing on-the-fly?

Video editing on the fly is not something the average home user needs. We had plans to offer simple home video editing in the Metabox 1000...you'd capture your home video from a camcorder, and have some facility for cuts and conversion. Nothing too sophisticated, because a practical device for the livingroom just doesn't have the kind of power necessary. And there's very little practical value in mixing live video anyway... unless you're running a television studio, you don't need it. That's not how video editing works.

With that said, the PS3 does offer a free video editing app via download. And it's a bit better than what we were thinking of back in 2000. Of course, the PS3 is many, many times more powerful.

If you haven't noticed, most game consoles do this job very well today. I have a PS3 in my TV room... it's my DVD/BD player. If I lived in an area with reasonable internet services, I could watch all kinds of streaming video: internet, Netflix, etc. I can drop a video in from an SD card (ok, they eliminated the memory cards in cost-reduced models.. the fact mine has a card slot is no accident), stream it from my PC, etc.

In fact, I edit video on my PC all the time, non-realtime editing of course, and even with a 6-core AMD 1090T at 3.2GHz it's still not fast enough. Yet, for modern video editing, switching tracks/sources is virtually free. The overhead is in decoding and re-encoding (if you need to re-encode) the video, and compositing with effects. That's why folks like me have fairly good CPUs ... don't need that for electronics CAD anymore, but you need all the performance you can get for video.

Last edited by lifeschool; 07 September 2011 at 23:45.
lifeschool is offline  
Old 16 August 2011, 18:44   #2
lifeschool
Local Moderator
 
lifeschool's Avatar
 
Join Date: Oct 2009
Location: Lancashire, UK
Age: 48
Posts: 1,591
> Would the Next Gen. computer come without a mouse?

No... mice work fine for many purposes. Tablets for some uses, touchscreens for others.


> Would the Next Gen. computer have EMP (Electro-magnetic-pulse) protection? What happens if there's an EMP wave that destroys all the computers?

You won't have to worry about power anymore. And an EMP will still destroy your computer, unless you intend to build it all out of rad-hardened components. No one will buy a $500,000 consumer appliance. If you want your media to survive the coming zombie apocalypse, or anything similar, do what I do... put it on optical backup, and store it in a good place.

Keep in mind a central theme at Commodore, Amiga, Metabox, and most places I've worked: high functionality products for a low price. This means building in only the set of features that will deliver the most benefit to the most people, and providing an easy way to add on extra stuff, as much as possible.


> Would the Next Gen. computer be kitted out with Virtual Reality? Are we to look forward to walking around with VR headsets and VR gloves on, and talking to our computer with voice commands?

Nope... no one wants a glove. Just in general, I hate 'em .. even winter gloves. I don't want anything covering my hands unless absolutely necessary. And no audio interface will ever work well, apart from perhaps speech interfaces on small personal devices.

At Fortele, we had a speech interface on the remote control... you had a speaker and mic. The control could be used as a phone or intercom, but also for voice macros to the system. But it also had buttons. Saying "NBC" into my remote to get to NBC would be cool... having to say "up-up-left-select" to operate the UI would be an epic fail. The touch UI on Android is very nicely augmented with voice input, for search, etc. Works just dandy... I can make a phone call faster with voice than touch.

But this only works because it's a closed system -- in both cases, I'm pushing a button to capture the command, and I'm talking privately.... so the system only has to hear my voice commands. An open mic on a desktop or monitor will have to sort between my commands, the sounds on the television, the guy in the office next to me, etc. This is precisely why every audio input idea designed for general purpose use has failed.

Of course, if you want a voice interface and a glove, these are technically pretty easy to add to existing systems already. As mentioned, it's already built into Android, and there's at least some audio UI built in on Windows. A glove would just be another USB controller. I've seen a few used for immersive VR applications -- the only practical way, so far, to link my real hand to my avatar hand in such a system. If we're operating our desktops in immersive stereoscopic video at some point, maybe I'll retreat on the glove rejection a little bit. Of course, we're already seeing, via "Minority Report" (the 2002 film that got pretty much everyone, Apple, Microsoft, Google, etc. working on touch screen interfaces) and via MS's latest gaming stuff (though this goes all the way back to "Mandala" on the Amiga in 1986 or so), that computers are going to get really good at reading hand gestures without the need for anything as annoying as a glove.


> Will Next Gen. video screens shrink to the size of I-glasses and other personal headsets, or will they grow and expand to the size of IMAX resolution screens? What about 3D?

[I-glasses and headsets] are very low resolution compared to screens. My 24" monitors are actually larger in my visual field than the typical IMAX screen... it's all based on seat position vs. screen size. The minimum for an IMAX movie is less than you might think. IMAX film has an incredible resolution... it's shot on 70mm stock, but the 60-something-mm edge is the short edge (eg, the film runs horizontally through the projector).
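
(The geometry, for the curious - my numbers, and the viewing distances are assumptions: the horizontal angle a screen fills is 2 x arctan(width / (2 x distance)). A 24" 16:10 monitor is about 52 cm wide, so at a 60 cm viewing distance it fills roughly 2 x arctan(26/60) ~ 47 degrees; a 22 m IMAX screen seen from 20 m back fills about 58 degrees, and from 30 m back only about 40 degrees. So a monitor at arm's length really can occupy as much of your visual field as an IMAX screen from the rear rows.)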

But for digital, the basic IMAX 3D film uses two 2K projectors, one for each eye. A 2K projector is nominally 2000x1000 pixels... roughly the same as HDTV. Some theaters (IMAX or not) are using 4K projectors... that's nominally 4000x2000 pixels. That's the absolute state of the art in theaters these days... though NHK in Japan is actually working on an 8K system.
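
(In raw pixel counts - my arithmetic: DCI 2K is 2048 x 1080, about 2.2 Mpixels, barely more than 1080p HDTV's 1920 x 1080 at roughly 2.1 Mpixels; 4K is 4096 x 2160, about 8.8 Mpixels, four times that; and an 8K system would be around 33 Mpixels.)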

Again, the head-mount displays are useful in the fully immersive 3D context. You may not have much resolution per eye, but of course, with good motion tracking, you can have a very large effective screen. And that's what it would take to get me in one of those rigs. I think right now, we're not quite there computationally, and we're definitely not there OS-wise. But that could be a cool way forward. Particularly if some of the better micro-display technology is pushed, to deliver better resolution in such headgear.


> Will the Next Gen. computer allow me to interface my exercise bike and rowing machine so that I can exercise in full VR while competing online?

Of course, once you're doing all that, you're getting to some very hard core gaming add-ons, not something the average computer user is going to care about.


> They say today's LCD TVs and monitors aren’t built to last into the future – what’s your take on this?

No.. modern LCD screens are exceptionally stable, and will last a very, very long time. LCDs really don't age in any significant way, at least on a human scale... other than the fact that the electronics themselves may eventually corrode and fail due to environment (but being fairly cool, an LCD will easily outlast your computer). The CCFL backlights have an MTBF of 50,000-100,000 hours.. that's awfully close to "forever", given that you're more likely to replace the screen for other reasons (I can get a better screen for $150) than due to failure. And LED backlights might well last past 1,000,000 hours.
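
(For scale - my arithmetic: even the low end of that range, 50,000 hours, works out to 50,000 / (8 x 365) ~ 17 years at eight hours a day, or close to 6 years of never being switched off at all; 1,000,000 hours is more than a century of continuous use.)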

And they're cheap... a 20-22" 1080p monitor can be had these days for under $150.


> The concept of OLEDs (Organic Light Emitting Diodes) is said to offer a broader spectrum of light than LEDs. So which is the more ‘future-proof’, LED or OLED?

I love the idea of OLED. There are still some technical issues, relative to screen life and larger displays, but I think it's one of the best possible large-scale replacements for LCD.

But why replace LCD? The native contrast on an LCD panel is usually 500:1 to 3000:1... some OLED displays do 1,000,000:1 or better. So the LCD panels today are using an array of modulated LED backlights to extend the contrast. As the LEDs get smaller... well, you're kind of moving to an OLED display anyway.

Going further ahead, there are a bunch of companies working on printable OLEDs. This would start to deliver things like wrap-around displays, disposable displays, super high quality, low cost large screen TVs, etc.


> So are we going to be able to bend, fold, and shape our mobile devices to whatever design we prefer? - transforming our phone from a wristwatch/bracelet style to a touch-screen, and then folding it up again to slip it into our pocket – like the Nokia Morph concept?

Of course the Nokia Morph is just a concept -- they can't build it yet. I think there's some room for this, but no evidence so far that it's not just a gimmick, at least this way.

On the other hand, there's going to be an increasing effort to make your digital devices always-at-hand. It's taken a good 30+ years of trying to deliver a good pocket computer, and we still don't quite have a viable Dick Tracy "2-Way Wrist TV"... but eventually, someone will have the iPhone or Droid of wrist-mounted smartphones... there are lots of problems to solve to make one of these useful: power, screen size, etc.

I've commented on my take about the leading things in the personal computer industry. The 1970s created the basic form of the personal computer, but still largely a hobbyist thing. The 1980s gave us the modern personal computer: multitasking, GUI, something a non-expert would use. The 1990s saw the personal computer largely catch up with human needs -- gamers, multimedia authors, scientists would still push the limit on computation (in fact, as of the mid-1990s, PC gaming was the primary motivating factor on nearly every PC innovation).

So by the 2000s... it was basically all about case design. The PC was done. Sure, new OSs, new chips, etc. But the same kind of things, just faster. No major breakthroughs in computing.

That leaves us with the 2010s... the post-PC decade. Those who weren't paying close attention didn't realize that a circa 2000 PC was fast enough for nearly every user. So they got surprised when much slower devices, like Netbooks and ARM Tablets and smartphones, magically started also being fast enough for the average user.

This has the uncomfortable result that case design is now a primary motivating factor in unit sales. Apple understood this earlier than most PC companies... in the latter days of the PowerPC, they were still getting 2x-3x the price for their machines, versus much faster bog-standard PCs. But the casework -- superb.


> Would the Next Gen. computer offer further Cloud Computing functionality? Is Cloud Computing something you would advocate?

It has its place.

I think the basic Internet is working out pretty well, and the fact that Internet applications are doing more server-end computing is fine. There are some things, like web search, that only work as "cloud" applications.

It's also the ideal back-end for mobile services and sync. A mobile device, like a smart-phone or a tablet, could very well become the only computing device some people need. But historically, companies like Apple, Microsoft, and Palm treated these essentially as PC peripherals. There was a time when you could do virtually nothing stand-alone: no app or music purchases, etc. If you wanted a backup, even on a smart-phone, you'd have to sync to your PC.

Android was instrumental in showing how this could be different. Google, of course, had no attachment to the PC world. It's not universal, but to a large extent, things just work right on Android. If I buy an application via my PC or other web device, it'll just show up on my tablet or phone.. but I can buy on the tab or phone just as easily. If my smart phone is lost or stolen, I get much of the old phone's environment back just as soon as I authorize a new phone. Notes I take on these devices automatically and instantly sync to my other devices.

On the other hand, I don't believe cloud computing will replace local computing for most applications. Some corporations are drooling at the prospect, but they never really dealt with the move from mainframe to PC, or the power that gives the PC user. Cloud-based computing will give these people the central authority they like... the only problem is how much of that control they have to concede to a cloud provider, rather than run it themselves. I see some advantage for businesses doing number crunching, to be able to buy computing power as-needed, rather than buy for the peak and let all that hardware sit around at other times. But curiously, that's been part of the reason for cloud computing in the first place. For example, Amazon has to have the servers to meet their Christmas season order demands, or they're doomed. So at other times of the year, they commoditize that extra capacity.

And of course, you're not always online. Or the data demands are too high... I don't see video editing as a useful cloud computing function, given the hours it would take me just to upload a video. But email -- never a big computational deal anyway, unless you were Microsoft (Outlook is more bloated and less functional than most emailers I've used). And there are some advantages to the web orientation. I didn't really believe this until I started using GMail quite a bit... it's really a good system.
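
(Putting rough numbers on that - mine, with assumed figures: an hour of AVCHD camcorder footage at ~17 Mb/s is roughly 7-8 GB; at 8 GB that's about 64,000 megabits, so pushing it through a 1 Mb/s residential uplink is around 64,000 seconds - about 18 hours of uploading before you've edited a single frame.)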


> How about HTML 5? It’s been a long time coming. Is it any good?

HTML5 is certainly a good thing. The Internet, like any other piece of technology, needs to evolve. HTML5 solves some (not all) of the issues we've previously turned to proprietary solutions for, like Adobe Flash. And it's pretty clear this is well within the capability of most PCs, or for that matter, most smart-phones today. So why not?

This is sort of the idea of evolution that we've had all along in the PC industry. PCs have their slots and ports, web browsers have their add-ons and plug-ins. Once a function is useful enough that pretty much everyone needs it, it ought to be in the web browser, and if it makes sense, in HTML itself.

When I was at Metabox building my set top box, we had a version of the Voyager web browser with some extra HTML tags. With a couple of lines of HTML, I could create a video window of any size, and launch a video from any hardware player in the system (DVD, DVB, internet MPEG/MPEG-2). Little things like this allowed the STB to basically live "in" the web browser all the time.. the main UI and all that... just web pages. Very easy to build, seamless transition from what's on the device to what's on the web, etc.


> Looking at the very distant future now… How do you see Nano technology evolving? Are we ultimately heading towards computers interfacing with the Human brain?

These are two separate issues. I'll take the latter one first. If I set you up in a room, alone, and give you a general knowledge quiz... some questions from "Jeopardy" or something like that, you might do ok. If you're in there with a computer and access to any reasonable modern search engine, you'll do substantially better. In this way, the computer, just in this one context, functions as a very crude brain amplifier... it's not thinking for you, but it's allowing you to access information far more rapidly and accurately.

I think there is absolutely no doubt that, eventually, we'll have computers integrated into the brain in such a way that they function directly as brain amplifiers... I think of a question, I get answers, pretty much as if I had thought of them. I work on a problem, and my ability to visualize, in my mind, will feel much like it does now, but things will be persistent, like on a computer... that image I picture, that schematic, that short film storyboard, etc... will still be there, in my mind, days, weeks, or months later. But the function will be far more organic -- a more sophisticated kind of brain amplifier.

This is normal evolution... every creature that makes artifacts evolves with those artifacts. And we make the best ones. It's affected us profoundly already... humans used to live to an average age of 35, then 45, etc.. and now it's up there in the 70s or 80s, depending on where you live. And that, too, is just a primitive thing -- better knowledge of staying healthy, better treatments for fatal diseases, better surgery, etc.

One use for a nano-machine might be pushing this limit to whole new levels. The classic idea of nano-technology is a machine of some kind, possibly a computer of some sort, that's built on a nanometer scale. Such devices are so small, they might last indefinitely. Some models have them self-replicating, others not so much. But either way... imagine a sea of nanobots in your bloodstream, all networked, working to keep you healthy. If they see high sugars or cholesterol levels in the blood, they physically remove those compounds. If they find fat deposits in the arteries and veins, they destroy them. Same with rogue cells (eg, cancers). Just doing this, in fact, you'd wipe out several of the causes of death in western societies: no more heart disease, no more cancer, no more diabetes, etc. And they'd probably fight off infections, so most standard disease is gone too. And that's before we learn to make them rebuild shortened telomeres, or anything really clever like that.


> Finally, when will I get my Holodeck?

I'm still waiting for my flying car!

fin.

-- Many thanks to the wonderful Dave Haynie for taking the considerable time and effort to reply to all those questions! Thank you for a terrific interview!! --

Also mirrored on Lemon, see this thread.

Last edited by lifeschool; 16 August 2011 at 18:52.
lifeschool is offline  
Old 16 August 2011, 19:25   #3
Amiga Forever
Registered User
 
Join Date: Jan 2010
Location: UK
Posts: 228

What a Great Read

p.s. I am glad that I remembered my password to get into the English Amiga Board and yes, I am back
Amiga Forever is offline  
Old 17 August 2011, 16:08   #4
arnljot
Registered User
 
arnljot's Avatar
 
Join Date: Jul 2008
Location: Oslo
Posts: 76
Awesome read. And it's really interesting to see his perspective both on the history and the future of computing, very cool indeed!
arnljot is offline  
Old 17 August 2011, 16:35   #5
Amigaman
Registered User
 
Amigaman's Avatar
 
Join Date: Aug 2010
Location: London
Posts: 124
Quote:
Originally Posted by lifeschool View Post


...So by the 2000s... it was basically all about case design. The PC was done. Sure, new OSs, new chips, etc. But the same kind of things, just faster. No major breakthroughs in computing....
I was asked once in a job interview, about 10 years ago, something along the lines of "what or where do I see technology going over the next few years?" I said basically the same thing; not many innovations to come, and just revisions and speed increases. The interviewer looked at me as if I had suggested burning the office down. I didn't get the job.

This has been a great read, and I have to agree on pretty much everything. It is amusing to see new CPUs coming out with a fraction of the power of the top-end ones that can still do everything you need. I wonder where else technology can be stripped back to functional norms instead of making things more and more complicated. Whatever happened to RISC?

Great read
Amigaman is offline  
Old 17 August 2011, 18:51   #6
RobertB
 
Posts: n/a
Quote:
Originally Posted by lifeschool View Post
...then your videos at the 2005/2006 CommVEX/User Group...
Dave Haynie came to the July 25-26 CommVEx 2009.
Quote:
...(in which you are sadly underused as an interviewee - guess you noticed) on youtube.
We do not post on Youtube. CommVEx videos go to Blip.tv, due to the length of the videos.
Quote:
Yeah... that was kind of weird.
I didn't know that Dave was waiting for the videos to be posted. FWIW, most of Dave's CommVEx 2009 videos have not yet been posted up to Blip.tv. Hollywood filmmakers, Rory Muir and Jerold Kress, were also there to film, and they interviewed Dave; they, too, have not completed editing the mountains of video they took. However, Michael Battilana of Cloanto was there and took a short video of Dave playing the guitar. It can be found on Facebook under the Commodore Vegas Expo (CommVEx) 2009 area set up by Mike.
Quote:
...then it was basically up to me to just hang around. I even broke out a guitar and played for them while they were taking the show down.
We're all pretty casual at CommVEx. If Dave wanted to give a formal speech, he could have done so. However, at the beginning of the show, we just gathered some chairs around him and listened to him talk and asked him questions. That went on for awhile.

Truly,
Robert Bernardo
organizer - Commodore Vegas Expo
http://www.portcommodore.com/commvex
 
Old 18 August 2011, 11:24   #7
chocolate_boy
Registered User
 
Join Date: Dec 2005
Location: UK
Age: 41
Posts: 230
Quote:
Originally Posted by RobertB View Post

We do not post on Youtube. CommVEx videos go to Blip.tv, due to the length of the videos.

http://www.portcommodore.com/commvex
The 10/15 minute time limit on YouTube was removed over a year ago; you can post videos of any length now, provided you haven't infringed copyright and the file is under 3.2 gigs or so.
chocolate_boy is offline  
Old 18 August 2011, 14:46   #8
lifeschool
Local Moderator
 
lifeschool's Avatar
 
Join Date: Oct 2009
Location: Lancashire, UK
Age: 48
Posts: 1,591
Quote:
Originally Posted by RobertB View Post
Dave Haynie came to the July 25-26 CommVEx 2009. We do not post on Youtube. CommVEx videos go to Blip.tv, due to the length of the videos.
The videos I was talking about in the first question can be found here:

VID - CommVEx 2005 - http://video.google.com/videoplay?do...54351342258637
VID - CUG - 2006 (1) - [ Show youtube player ]
VID - CUG - 2006 (2) - [ Show youtube player ]

I haven't seen anything posted more recently than these.


Quote:
Originally Posted by chocolate_boy View Post
The 10/15 minute time limit on Youtube was removed over a year ago, you can post videos of any length now, providing you haven't infringed copyright and file is under 3.2 gigs or so.
Just to save any further confusion over this, in the UK at least, the 15 minute limit is still in place. I tried to upload a file just yesterday which was 15:48 long and it got rejected. The vid was 800MB in size. I cut it down to 14:59 and it went through fine.

Last edited by lifeschool; 18 August 2011 at 23:49.
lifeschool is offline  
Old 18 August 2011, 16:50   #9
asymetrix
Registered User
 
Join Date: Jul 2009
Location: UK
Posts: 112
Good interview.

I really got excited when I heard about it - I thought that Dave had created a new Amiga motherboard or something!

I would have liked that - a current take on Amiga hardware, or to learn something unique about AmigaOS and technical ways to bring it up to current/future standards.

It's good to know about the past, but more important to me are current developers - maybe a video or two of official AmigaOS developers in action, to learn about their workflow etc.

Anyone made a video of Hyperion HQ ?
asymetrix is offline  
Old 18 August 2011, 18:46   #10
kriz
Junior Member
 
kriz's Avatar
 
Join Date: Sep 2001
Location: No(R)Way
Age: 41
Posts: 3,185
Great interview, such nice ideas and information!! Thanks.
kriz is offline  
Old 18 August 2011, 19:31   #11
Photon
Moderator
 
Photon's Avatar
 
Join Date: Nov 2004
Location: Eksjö / Sweden
Posts: 5,602
Interviewing someone who will be driving the technologies speculated about would have been more worthwhile. It came off as someone speculating having an ordinary chat with someone speculating, and sometimes a laundry list of what ports a new computer should have... and all over the place: nano tech, html 5, what the heck?

I think he did good work at C=, this just sounded like two guys having a normal chat going "so Jeb, what do ya think about this newfangled acronym or other".

Those who are thrilled about hearing something from Haynie have some nice reading to look forward to I'm sure, so
Photon is offline  
Old 19 August 2011, 05:27   #12
desiv
Registered User
 
desiv's Avatar
 
Join Date: Oct 2009
Location: Salem, OR
Posts: 1,767
Quote:
Originally Posted by Photon View Post
It came off as someone speculating having an ordinary chat with someone speculating,
Assuming that the "someone" involved was in the right place at the right time in computer history, then "yes", I agree..

If that someone is your buddy at work who has worked on PCs for years, then "no", it's not the same..

It was a bit all over the place, but I enjoyed it. Thanx!

desiv
desiv is offline  
Old 20 August 2011, 12:26   #13
lifeschool
Local Moderator
 
lifeschool's Avatar
 
Join Date: Oct 2009
Location: Lancashire, UK
Age: 48
Posts: 1,591
Quote:
Originally Posted by desiv View Post
It was a bit all over the place, but I enjoyed it. Thanx!
Yeah, it did turn into a bit of a monster. I tried to keep the questions in a logical order, e.g. old tech, modern tech, new tech, and I threw in a few wildcards like the EMP question (which doesn't fit anywhere but I kept it because of the witty response). Cloud computing was a hot topic not so long back, and I wanted to ask him about the latest advances in memory, VDU technology, and that old perennial: Virtual Reality. I'm more than happy with the results, and I think we managed to cover a lot of ground. I'm also glad I asked the $64M question: why are computers slow - and finally understood why.
lifeschool is offline  
Old 20 August 2011, 19:16   #14
RobertB
 
Posts: n/a
Quote:
Originally Posted by lifeschool View Post
The videos I was talking about in the first question can be found here:

VID - CommVEx 2005 - http://video.google.com/videoplay?do...54351342258637
VID - CUG - 2006 (1) - [ Show youtube player ]
VID - CUG - 2006 (2) - [ Show youtube player ]
Those videos premiered at CommVEx, but Dave did not attend the show in those years.
Quote:
I haven't seen anything posted more recently than these.
As mentioned above, there is a video posted by Michael Battilana, and it can be found on Facebook. I have not been given permission to release that generally.

Truly,
Robert Bernardo
Fresno Commodore User Group
http://videocam.net.au/fcug
 
Old 21 August 2011, 00:53   #15
Photon
Moderator
 
Photon's Avatar
 
Join Date: Nov 2004
Location: Eksjö / Sweden
Posts: 5,602
Quote:
Originally Posted by lifeschool View Post
why are computers slow
Computers are insanely fast; software is slow because of many accumulated layers both in the apps, the APIs and the OS, and the choice of tools, over 3 decades. The professional programmer will defend the layers and tools with miscellaneous reasons: working in teams, project budgets, established (and forced by market share 'moves') standards. When in fact it's his boss talking.

Another defense is that software is good because it works with so much (interfaces, peripherals), but this is just a natural process of makers of interfaces and peripherals flocking to established standards.

Right now we have viral internet and free-thinking individuals, but the above facts are now so entrenched they are hard to budge, by a new OS, programming language, standard, or whatever you please to choose. Decades ago, we had only snailmail and computer newbies or in the best case computer mediocrities in suits countering the pre-emptive moves of startup corporations. Else there might have been no Windows, Mac, or Linux at all but something really nice where layers had been kept strict and few.

But Mr. Perfect Programmer must also be prepared to tackle the assault of all the 10000 needs to be satisfied per year. This is the best we got out of that, we could certainly have done better without corporations breaking international law, but let's reconcile ourselves with the fact that the shit is working most of the time, and lets us get on Facebook. Right?

Not that any of the stuff I do is slow on any of the computers I have, I don't know what you're using ... I could almost have agreed with that in 2002.

Let me know when you want an interview. I won't share my thoughts on nanotech, because I know my limitations.
Photon is offline  
Old 21 August 2011, 01:22   #16
desiv
Registered User
 
desiv's Avatar
 
Join Date: Oct 2009
Location: Salem, OR
Posts: 1,767
Quote:
Originally Posted by Photon View Post
<Generic tech talk from some tech guy who's impressive on a forum>
...
Let me know when you want an interview. I won't share my thoughts on nanotech, because I know my limitations.
And this is why people are interested in reading Dave Haynie's replies and yours.. well, not so much... No offense..




desiv
desiv is offline  
Old 21 August 2011, 03:12   #17
Photon
Moderator
 
Photon's Avatar
 
Join Date: Nov 2004
Location: Eksjö / Sweden
Posts: 5,602
Well, replies have to be impressive to impress beyond fandom. Maybe if the questions had been specific.
Photon is offline  
Old 21 August 2011, 04:47   #18
desiv
Registered User
 
desiv's Avatar
 
Join Date: Oct 2009
Location: Salem, OR
Posts: 1,767
Quote:
Originally Posted by Photon View Post
Well, replies have to be impressive to impress beyond fandom. Maybe if the questions had been specific.
Oh. I get it. Now if someone says they were impressed, they are fanboys...
You're clever.


desiv
desiv is offline  
Old 21 August 2011, 22:17   #19
Photon
Moderator
 
Photon's Avatar
 
Join Date: Nov 2004
Location: Eksjö / Sweden
Posts: 5,602
Quote:
Originally Posted by desiv View Post
Oh. I get it. Now if someone says they were impressed, they are fanboys...
You're clever.


desiv
That's way too suspicious of you. No, that's not what I meant at all, and I hate such games. I'm even a bit of a fan of Dave - just for being involved in Amiga hardware dev. "Fanboy" is your exaggeration from "fan". I do confess being a fanboy of Jay Miner's but if this ubercool dude was here today, I'm still not so sure he could predict future techs, definitely not which ones will be anything but a footnote.

Quote:
Originally Posted by desiv View Post
If that someone is your buddy at work who has worked on PCs for years, then "no", it's not the same..
Well, that's just it. The Q & A reminded me of exactly that. The thing about this age is that there are no real upsets or secret tech being worked on that's not leaked in an orderly manner. Huge companies do the math, then have the necessary tech dev done and tell standards institutes what will be standard in future, all kept secret by NDAs, then it's worked on, and then it's finished.

So future predictions are best done by people with insight into company project budgets, mass component purchase figures, purchases of Asian factories rather than "some really cool tech info". That's why most such questions are answered at the PC-buddy level even by experts; that's not anyone's fault.

Any cool gadget/invention/peripheral that should get real attention at a CES is quickly either bought up+converted into a big brand name product or prior art prosecuted/bought up+ditched. This gives the companies with big money the edge, so they can keep delivering incrementally newer tech. And as we know, at least when standards have not been established for years, this buying up of tech sometimes brings nice new tech to us. In the Asian markets and niche markets there are still some progressive companies having success.

This cynical reign didn't start now; the first real inkling came with criminal acts committed in the 80s and 90s but took real hold around the failed battle between OpenGL and DirectX in 2001-2002, and as long as there is big money in computers and software, I can't see it ending until an international law is passed that outlaws using economic power and resources as nuclear weapons between countries and markets, the US Patent laws comply with international law, and laws are passed to ensure that big and small companies are equal before the law.

I don't see many more attempts to break existing laws to grab market share in the future though, for a number of reasons. That's a comfort, at least. And there will be exceptions, mainly on the software side, for a while yet.

But I digress. (And perhaps depress.)


This is it, though.

Last edited by Photon; 21 August 2011 at 22:26.
Photon is offline  
Old 22 August 2011, 03:49   #20
desiv
Registered User
 
desiv's Avatar
 
Join Date: Oct 2009
Location: Salem, OR
Posts: 1,767
Quote:
Originally Posted by Photon View Post
That's way too suspicious of you. No, that's not what I meant at all, and I hate such games.
Fair enough..
I've seen some people who have taken offense to some of his statements really rip on him, regardless of the topic..

Yes, the questions did wander, but it didn't bother me..
You had said "replies have to be impressive to impress," which sounds obvious, but I disagree...
Or I should say, I think that's limiting..
Replies don't have to be impressive.. At least not for me..
They just have to be interesting/entertaining...
Impressive is a nice side benefit..

His perspective and experience makes those replies interesting to me.

Now, I'm not going to rush back to work and yell, "Wait, our whole client/server vs web methodology needs to be changed because Dave said.."

But it's a different perspective that is worth hearing and is interesting to me..

Maybe it's because trying to "predict" future tech is part of what they pay me for (crazy, huh?) and I find many perspectives interesting..

And some impressive..

desiv

Last edited by desiv; 22 August 2011 at 04:37.
desiv is offline  
 

