26 July 2015, 17:55, #34
pandy71 - Registered User (Join Date: Jun 2010; Location: NL/PL; Posts: 1,429)
Originally Posted by Mrs Beanbag
i think they do care... they just don't know any better. because yes, they have no reference. the slowest thing they ever coded for ran at 1GHz and had 256Mb of RAM so the idea that that is a lot is completely lost on them, if you said you could get their code to run on a 10MHz CPU with only 256Kb of RAM they'd think you were being funny.
Nope - trust me - they have no background (I've talked with them; they have no clue about, for example, video or audio), they don't understand the implications of doing things one way or another (why try to update the screen contents more often than the video frame rate? It's a simple thing: check VSync first, don't go faster than VSync), and on top of that they have no knowledge of the standards.
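Just to illustrate the idea - a minimal sketch, not real driver code: it assumes a 60 Hz display and uses a plain POSIX sleep in place of a real vertical-blank wait, since the actual VSync mechanism depends on the platform.

Code:
/* Cap screen updates at the display refresh rate (assumed 60 Hz here)
 * instead of redrawing as fast as the CPU allows. Real code would wait
 * on the actual vertical blank; this only shows the principle. */
#include <stdio.h>
#include <time.h>

#define REFRESH_HZ 60

int main(void)
{
    const long frame_ns = 1000000000L / REFRESH_HZ;
    struct timespec frame = { 0, frame_ns };

    for (int i = 0; i < 10; i++) {        /* ten "frames" for the demo */
        /* ... redraw the screen contents here ... */
        printf("frame %d presented\n", i);
        nanosleep(&frame, NULL);          /* never update faster than   */
                                          /* the display can show it    */
    }
    return 0;
}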
I believe that nowadays most code is created in a different way, and management (with its unrealistic time planning) can be blamed only partially.
Worse - we as customers accept this as an unavoidable cost of progress... there is a general acceptance of low-quality code... soon it will be low quality in everything - services, health care, etc...

Originally Posted by Mrs Beanbag
i'm not talking about code complexity (although sometimes that is also a problem), but algorithmic complexity. This is taught about in computer science courses and in any good book on the subject but it somehow doesn't seem to sink in.

For example if you write a bubble sort, that will be fine if you are sorting maybe 10 or so elements, but if your dataset grows to a million elements it's going to be slow, no matter how much you optimise it. Because it will still be a bubble sort, and a bubble sort has complexity of O(n^2), in other words when the size of the problem doubles it takes four times as long to run. So if your memory size doubles, and your CPU speed doubles, your program is going to be half as fast. This is what happens in the real world. And i have seen people using O(n^2) algorithms, in fact one time i refactored what was i think even an O(n^3) algorithm (i didn't do a proper analysis, it was bad anyway). In the end i got it down to linear time complexity but people were too sceptical that it would actually produce the same result.
I agree, but this is an area where you must select an algorithm better suited to the job, or change the architecture so that everything speeds up - you can't ignore physics and believe that the compiler will magically clean up this mess.
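To put the same point in code - a minimal sketch, nothing from the actual case above: it just contrasts a textbook O(n^2) bubble sort with the C library's O(n log n) qsort() on a million random ints.

Code:
#include <stdio.h>
#include <stdlib.h>

/* O(n^2): fine for a handful of elements, hopeless for a million,
 * no matter how much you micro-optimise the inner loop. */
static void bubble_sort(int *a, size_t n)
{
    for (size_t i = 0; i + 1 < n; i++)
        for (size_t j = 0; j + 1 < n - i; j++)
            if (a[j] > a[j + 1]) {
                int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
            }
}

static int cmp_int(const void *p, const void *q)
{
    int a = *(const int *)p, b = *(const int *)q;
    return (a > b) - (a < b);
}

int main(void)
{
    int small[10] = { 9, 3, 7, 1, 8, 2, 6, 0, 5, 4 };
    bubble_sort(small, 10);               /* 10 elements: no problem   */

    size_t n = 1000000;                   /* a million elements        */
    int *big = malloc(n * sizeof *big);
    if (!big) return 1;
    for (size_t i = 0; i < n; i++)
        big[i] = rand();

    qsort(big, n, sizeof *big, cmp_int);  /* O(n log n): still quick   */
    /* bubble_sort(big, n); */            /* roughly n*n/2, i.e. hundreds
                                             of billions of comparisons */
    printf("%d %d\n", small[0], big[n - 1]);
    free(big);
    return 0;
}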

Originally Posted by Mrs Beanbag
That is because of 5-inch colour touchscreens.
Nope... there is more to it than that... but that is another topic - technology still suffers from limitations that make some things impractical...