Originally Posted by guybrush
Did programmers get worse with the increase of speed?
Well... I see this every day in my work - there are a few product lines: two of them use modern SoCs and one a somewhat older one (I mean 400 MHz, so still faster than the fastest Amiga).
All three products use different software technologies (different APIs, different Linux versions, etc.), but they are very similar in terms of target customers.
One platform uses a slightly outdated (400 MHz) STM SoC, the second a fancy piece of crap from Intel (called Groveland), and the third a modern SoC from Broadcom (some people already know what kind of products I'm describing).
So imagine three different HW platforms and three different SW teams. Now imagine that the slowest HW has the best look and feel with the same GUI (all three products look the same from the customer's perspective). On top of that, the slowest HW has very limited RAM (less than 256 MB, where the other two SoCs have 1 GB)...
So if you ask me whether this means developers have gotten worse, I would answer: definitely yes. I blame lack of experience and lack of knowledge - you can be a PhD in one area and still fail in another. The Amiga had the luck of being designed by people with passion but also with knowledge. Personally, I believe that with modern HW people just rely on the compiler instead of writing fast, optimized code - and I can even understand it: they are just earning money and thinking about getting home as fast as possible... No passion...
I have a lot of strange discussions, such as about the difference between a debug and a regular build - most developers believe that a debug build is the same as a non-debug one and that enabling debug doesn't change anything...
Sometimes I give up... and that's why we can't have a decent computer with a decent OS (with all due respect to Linux, I don't think it is a good universal OS that can run on everything from large supercomputer clusters to small embedded systems - it simply doesn't scale down well...).