Old 04 May 2021, 10:04   #30
mcgeezer
Registered User
 
Join Date: Oct 2017
Location: Sunderland, England
Posts: 2,702
Quote:
Originally Posted by Thomas Richter View Post
First of all, wrong approach. You let graphics/intuition do the allocation, and then use that memory for your graphics. I already said that there are alignment restrictions on bitmap memory that depend on the chip generation and the screen mode. Second, even if you allocated the memory yourself (hopefully with AllocBitMap()), you can provide this information to intuition to create a screen for you. But see above - this usually makes little sense.
I'd agree with you if I were writing an Amiga OS application; however, I'm not, so my approach is right given the memory constraints I'm under. And no, I don't use AllocBitMap() because I don't have to: I can quite easily align the memory myself with my in-game routines.

Quote:
Originally Posted by Thomas Richter View Post

No, the answer is "your approach is wrong". Please rethink.
No, my approach is fine. Several other programmers in here are using an easier, viable alternative for the very problem I faced. I simply don't have the RAM available to bend and agree with your "your approach is wrong" statement.

Quote:
Originally Posted by Thomas Richter View Post
That is not guaranteed. Please see the RKRMs. Some hardware (but probably not yours) does require regular interrupts. The RKRMs (and that is the important reference) state that you can only disable for a couple of milliseconds. I do not recall the precise number, but it's specified.
This comes back to the question I raised earlier in the year about game devs having to support every bit of hardware plugged into a system. I've had this argument before: game devs who want all the hardware resources simply don't worry about it. If gamers don't like it, they can unplug the hardware and run the game from a floppy.