04 January 2013, 00:33   #23
Photon
Originally Posted by Mrs Beanbag:
If one has reserved memory using the OS functions, it is surely guaranteed the SSP won't point into it, surely? And it is possible to use interrupts in an OS-friendly way, VBL interrupts at least.
Nothing stops a stack from growing downward until it overwrites other running programs and OS data in memory. All it takes is running some fractal generator with deep settings, or Quicksorting a large dataset; anything that leans on the stack will do it. This is why AmigaDOS has the Stack command: if you don't specify a large enough stack, it will of course grow down into the next lower allocated memory area.

Whether you install an interrupt in an OS-friendly or OS-unfriendly way, it's OK as long as you know you won't overflow the stack - that is, as long as you allocate memory for everything in your game.

Originally Posted by Mrs Beanbag:
I guess people were still programming the way they always had on the 8-bits. Up until that point home computers didn't really get upgrades
You're missing the point. Nobody can make future-compatible software, not even OS-compatible software. C= couldn't make future-compatible OS software for its own planned future models. Today, modern computers are still updated, and only major software packages are updated to follow suit. "Why can't I play Quake I directly from the floppies anymore? -Grmbl, shitty programmers." <- this is nonsense. And id is actually still around!

The golden age for Amiga games was very short: only from 1985 (or realistically, 1987) to 1991, when consoles took over. It's ridiculous to expect more than five years of support and updates from a four-man company in Germany, France or the UK for a game that cost 25 GBP.


The reason you take over the system and use all the memory is not that you're "oldskool", but that non-trivial games require more than the ~200K left on a 512K Amiga once Workbench is loaded. If you write Workbench games, you're going to get the graphics and sound awesomeness of Sim City or Marble Madness - unless you stop supporting the million A500s and raise the hardware requirements, leaving you 1% of the market. (Today, replace "market" with "audience": although we now have more expanded machines, your hardware requirements determine your audience. AGA plus an 060? A couple of hundred people can run it, if that.)

Fastmem didn't (and largely doesn't) really help games much. Here's why: you could put all the code and map data in it, but that would likely only account for 150K in a big game. You couldn't put graphics, sounds, or the buffers for them there, because the custom chips can only read chip RAM. The point is that loading from disk was, and is, a replacement for holding the whole game in memory - and that is why loading from disk beats 'just write a huge binary, who cares whether people have memory for it'. (And yes, the reason games were trackloaded rather than file-loaded is that otherwise a 10-year-old could copy them in 5 minutes.)

Today, the reason to make a game disk-loaded is that those who own an A500/A500+ usually don't have a harddisk interface. But the major reason is that few companies are still around that could produce a game massive enough to need more than 1-2 disks of space. If the game doesn't need a harddisk to run, you shouldn't require one.


"NASA could get the HRM!". My answer to that is: well, what if you're not NASA, or not even a major developer?

If you lived outside the USA, then unless you got hold of C= USA's telephone number (how?) and managed to convince them to send you the manuals you needed (how?), you were out of luck until about 1989. That's almost four years.

Some already-established game companies managed it before that, of course; that's why you see companies like Electronic Arts, and programmers already in the business like Archer MacLean, releasing software early.

Trusty documentation publishers like Zybex et al. seized on the gap and published documentation for the hardware and OS, but only in part. Still, not even your local Amiga dealer was likely to stock the books; advertising for them was scarce; the local bookstore had 20 "computer" books out of 5000; basically, only word of mouth and a round of phone calls would even make you aware they existed. At 30-40 EUR a pop, which one would you buy?

But the main problem was that C= apparently decided you weren't supposed to code assembler on the Amiga, and they made the official hardware manual correct yet hopelessly terse and abstract. They certainly didn't write it to help you learn to program the hardware, and they taught nothing in it.

The Amiga was and is a great piece of hardware, and the OS impressively set the standard for how OSes work even today. Where C= did fail was in thinking that once the product was out, everything would solve itself; they were late and lax in supporting the platform (and, as we know, late in upgrading its performance). Of course, that's bl**dy easy for someone to say 25 years later, with hindsight, right?

But it's also a factual weakness, and I think some of you are taking the even easier way out by simply dismissing early Amiga coders without taking this weakness into account. I'm defending them a little.