English Amiga Board



View Poll Results: Do computers evolve slower than before?
Yes 55 80.88%
No 10 14.71%
They evolve at pretty much the same pace, from the beginning 3 4.41%
Voters: 68.

 
 
Old 11 July 2021, 16:23   #81
Mrs Beanbag
Glastonbridge Software
 
Quote:
Originally Posted by trydowave
i agree. the changes are there but they aren't mind-blowing. i still have a 360 under my tv next to the ps4. played halo 4 the other day. looks surprisingly comparable graphics-wise for a console that came out sixteen years ago!
this is correct, i think. Moore's Law is still holding up, computing power is still improving at the same rate it ever was, but once you get over a million triangles per second.. who's counting? The actual impact on computer game graphics is diminishing returns. Game engines will be doing a lot more things in realtime, in future, that are currently baked into the scene, but you would have to be an expert to notice that, or even know what it means. The big shift was when graphics went from 2D to 3D. Since then there have been noticeable improvements from one generation to the next (look back now at a PS1 game that was amazing at the time) but I don't think there will be any big change like that again, like going from Sega Megadrive to Playstation.

It's not all about games graphics, however.. but no matter how much faster our computers get, websites will find a way to fill them up with JavaScript.
Old 11 July 2021, 18:54   #82
PortuguesePilot
Global Moderator
 
Quote:
Originally Posted by Mrs Beanbag
(...) but I don't think there will be any big change like that again, like going from Sega Megadrive to Playstation. (...).

I always joke that the next huge step will be when we stop using our eyes to play games, i.e. when we use a "DirectLink" to our brain and "live" our games instead of playing them on a screen. There will be things like smells, moisture, vibrations, etc... a veritable "à la Matrix" experience. (Elon Musk's Neuralink says what?!)

That, or if hardware techs/mathematicians/engineers ever manage to develop a truly analogue "curved 3D splice" method instead of the "angular 3D splice" that has been used since the dawn of 3D graphics. A curve-based system would save millions of vectors and allow the spared power to be applied elsewhere, which would also result in sort of a quantum leap in graphical quality. Probably never gonna happen, due to limitations in how we build the binary system into everything that computes. A true "curve-based 3D method" would have to transcend the digital binary system (instead of just 0 or 1, we would have the whole range from 0.0 all the way to 1.0, which is truly analogue) and that would be an absolute revolution (which would, more than probably, also introduce a lot of problems, like breaking backward compatibility or continuity).
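(To illustrate the curved-vs-angular idea: a single curve is defined by a handful of control points, where a polygon mesh needs many short straight segments to approximate the same shape. A minimal Python sketch of a quadratic Bézier curve; purely illustrative of the maths, not how any GPU actually works:)

Code:
# A quadratic Bezier curve: three control points describe a smooth arc
# that an "angular" mesh would need many short line segments to match.
def bezier2(p0, p1, p2, t):
    # de Casteljau form: blend the three control points by parameter t.
    u = 1.0 - t
    return tuple(u * u * a + 2.0 * u * t * b + t * t * c
                 for a, b, c in zip(p0, p1, p2))

# Sample the arc defined by just three points at a few parameter values:
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(t, bezier2((0, 0), (1, 2), (2, 0), t))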

Seriously now, maybe ray tracing is the next big thing in graphical evolution. It's still in its infancy, and the difference it introduces could end up being quite big.
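(And for anyone wondering what ray tracing actually computes: a ray is fired through each pixel and intersected with the scene geometry, and the nearest hit decides the pixel's colour. A toy Python sketch of the core ray-sphere test; nothing like a production renderer:)

Code:
import math

# Minimal ray->sphere intersection: the basic building block of a ray tracer.
# A ray is origin + t * direction; a sphere is a centre and a radius.
def ray_sphere_hit(origin, direction, centre, radius):
    # Substitute the ray into |p - centre|^2 = r^2 and solve the quadratic in t.
    oc = [o - c for o, c in zip(origin, centre)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None          # nearest hit in front of the origin

# One ray shot straight down -z at a unit sphere five units away:
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0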
Old 11 July 2021, 20:45   #83
Foebane
Banned
 
I always figured the ultimate evolution of videogaming would be the Holodeck. I think we're absolutely nowhere near that, however.
Old 11 July 2021, 21:29   #84
Mrs Beanbag
Glastonbridge Software
 
Quote:
Originally Posted by Foebane
I always figured the ultimate evolution of videogaming would be the Holodeck. I think we're absolutely nowhere near that, however.
i think that most people don't have enough space in their house, in this country anyway; most people barely have enough space for VR (although that makes me sick anyway). the direct-to-brain idea is compelling, although i think i would be a bit scared of it..


Quantum computing will be the next big leap, technology-wise i think. i don't know if it will make a lot of difference from a user (or gamer) point of view, but it could potentially make a lot of algorithms asymptotically faster. I don't know! Quantum Computing is Weird!
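(A concrete example of "asymptotically faster": Grover's quantum search finds a marked item among N candidates in roughly (π/4)·√N oracle queries instead of ~N classical lookups. A toy Python comparison of the query counts only; it is not a quantum simulation:)

Code:
import math

# Classical unstructured search needs on the order of N lookups;
# Grover's algorithm needs about (pi/4) * sqrt(N) oracle queries.
for n in (10**3, 10**6, 10**9):
    grover = math.ceil((math.pi / 4) * math.sqrt(n))
    print(f"N = {n:>13,}: classical ~{n:,} queries, Grover ~{grover:,}")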


The other up-and-coming technology is AI. Imagine games rendered by deep learning! It could be extremely strange. Basically the computer would just be imagining things for you. Maybe writing a game will become more like writing a script: you just describe the locations and the characters, and the computer imagines it all for you.
Old 12 July 2021, 00:59   #85
alkis
Registered User
 
In my mind the next milestone in gaming will be reached when today's top-notch cinematics can be rendered in realtime. (e.g. [ Show youtube player ])
Old 12 July 2021, 01:03   #86
Dunny
Registered User
 
Quote:
Originally Posted by grond
What are the wattages of the M1 and the Core i7/i9?
We don't really measure that; we're interested in how well it performs from a processing POV.
Old 12 July 2021, 02:06   #87
lmimmfn
Registered User
 
Why are people comparing to Intel? They're far behind what AMD is doing with both chiplets and process shrinks: AMD are on 5nm with TSMC while Intel are still stuck on 14nm++++++++, playing with 10nm but with low yields. TSMC are due to produce 3nm chips next year (AMD confirmed).
Apple are just taking a different path, copying mobile with strong and weak cores on the same package. Performance-wise AMD are where it's at and the one to watch.
Old 12 July 2021, 03:23   #88
QuikSanz
Registered User
 
Built a new Win 10 machine not long ago: Ryzen 7 2700X (the core flies well), ASUS MB w/ 64 GB of 3000-speed mem and a Radeon Vega 56. Fast as hell.

Still like the feel of my old Amiga machines.

Chris
Old 12 July 2021, 08:59   #89
grond
Registered User
 
Quote:
Originally Posted by Dunny
We don't really measure that; we're interested in how well it performs from a processing POV.
Well, actually in microelectronics as a science the more important measure is instructions per watt, because IPC (instructions per cycle) and MIPS (millions of instructions per second) are pretty much meaningless in isolation. If the M1 is on par with a Core i7 consuming ten times as much power, it is easy to imagine what it will do if it is clocked higher and accompanied by 32 MB of third-level cache.
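(To make instructions per watt concrete, a toy calculation for exactly that scenario: same benchmark score, ten times the power draw. Names and numbers are hypothetical, not real M1/i7 measurements:)

Code:
# Hypothetical chips with the same benchmark score but different power draw.
chips = {"low-power chip": (1500, 15), "desktop chip": (1500, 150)}
for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.0f} points per watt")
# Same headline performance, a tenfold difference in efficiency --
# which is why the metric matters more than raw MIPS.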

But I agree it doesn't really matter much because Apple keeps its processors to itself. I don't care how pretty the girls are at somebody else's private party...
Old 12 July 2021, 09:16   #90
Foebane
Banned
 
I thought AMD was cheaper than Intel, but it seems my Core i3-8100 is cheaper than any Ryzen 5, which is the tier I would consider.

@grond

There goes this overclocking crap again. Why do people do that? It seems to cause no speed increase and just damages hardware.

I tried overclocking my Orchid Righteous 3D card back in the 1990s and it immediately started glitching. I put the clock speed back down, but it still glitched every now and then, when it DIDN'T before.
Old 12 July 2021, 09:34   #91
roondar
Registered User
 
Quote:
Originally Posted by grond
Well, actually in microelectronics as a science the more important measure is instructions per watt, because IPC (instructions per cycle) and MIPS (millions of instructions per second) are pretty much meaningless in isolation. If the M1 is on par with a Core i7 consuming ten times as much power, it is easy to imagine what it will do if it is clocked higher and accompanied by 32 MB of third-level cache.
Full disclosure: my interest in the M1 is not because I'm an Apple fan (though I do own an iPhone and a 10-year-old Mac Mini), but rather because I'm interested in new developments in the CPU market. With that said:

The interesting thing about the M1 is precisely the performance-per-watt statistics. It uses very little power (around 40 watts under full load), yet has single-thread performance about on par with the fastest AMD has to offer at this time (a CPU that uses over 140 watts). Its multithreaded performance is quite a bit less impressive: it's comparable to a 5600X, a midrange AMD part that uses about 65 watts.

However, it should be noted that several experts have pointed out that the current design of the M1 is partially this efficient because of the process that's been used. And apparently this process (designed for low-power CPUs) is hard to scale up: the current clock speeds are roughly as good as you're going to get, and the transistor count is close to the maximum economic die size for the process.

With this in mind, it'll be interesting to see what Apple does next with the silicon, which will also tell us if these caveats are actually true.
Old 12 July 2021, 11:10   #92
grond
Registered User
 
Quote:
Originally Posted by Foebane
There goes this overclocking crap again. Why do people do that? It seems to cause no speed increase and just damages hardware.
This hasn't got anything to do with overclocking. It's a fact that the power consumption of a CMOS integrated circuit grows with clock rate, even if the clock rate remains within the hardware specification.
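(For reference, the first-order textbook model of dynamic power in CMOS, not specific to any particular chip, is P_dyn ≈ α · C · V² · f, where α is the switching activity, C the switched capacitance, V the supply voltage and f the clock frequency. And since higher clocks usually require a higher voltage as well, power in practice grows faster than linearly with frequency: a 30% clock bump that needs 10% more voltage costs about 1.3 × 1.1² ≈ 1.57× the power.)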
Old 12 July 2021, 12:25   #93
rothers
Registered User
 
Quote:
Originally Posted by roondar
However, it should be noted that several experts have pointed out that the current design of the M1 is partially this efficient because of the process that's been used. And apparently this process (designed for low-power CPUs) is hard to scale up: the current clock speeds are roughly as good as you're going to get, and the transistor count is close to the maximum economic die size for the process.

With this in mind, it'll be interesting to see what Apple does next with the silicon, which will also tell us if these caveats are actually true.
Well, we're getting the M1X in October; it will be the first Mac I've ever bought new, that's how excited I am about this architecture. M2 next year.

Have you seen the videos which point to Apple playing a very clever game with the silicon, where it's suspected they can keep adding cores even from badly binned parts?

We're at the start of a fascinating route by Apple; rumours are they could make a 256-core version of the M series pretty easily, and it might actually already exist.

Imagine having 256 cores at this low power level? This is exciting.

Imagine that power in a desktop or laptop? This is mad. They are coming for Intel.

I do also like what AMD are doing, but it's all last-gen; x86 is dead. It's like car companies still making ICE cars: it's cool to see them pushing oil to the limits, but EVs are the future.
Old 12 July 2021, 12:32   #94
grond
Registered User
 
Quote:
Originally Posted by rothers
Imagine having 256 cores at this low power level? This is exciting.

Imagine that power in a desktop or laptop?
I actually have six physical / twelve logical cores in my desktop computer and don't have much use for more than a couple of them. When it comes to gaming, I have a graphics card with a gazillion parallel cores.


Quote:
They are coming for Intel.
No, they are precisely NOT coming for Intel because Apple does not sell their CPUs to OEMs. And that's why I respect the engineering that went into the M-chip but ain't in the least excited about it.

But hey, this adds to the flawed Apple/Commodore analogy: Commodore didn't sell 6502s/6510s to OEMs either (well, at least not many).
Old 12 July 2021, 13:11   #95
freehand
Registered User
 
Quote:
Originally Posted by Mrs Beanbag

Quantum computing will be the next big leap, technology-wise i think. i don't know if it will make a lot of difference from a user (or gamer) point of view, but it could potentially make a lot of algorithms asymptotically faster. I don't know! Quantum Computing is Weird!
If or when this happens successfully, the world will change forever.
Old 12 July 2021, 14:02   #96
roondar
Registered User
 
Note: my post might seem a bit critical. Please understand this is not because I don't think the M1 is cool (it is cool to have a potential competitor to x64), but because some of the things said about the M1 CPU seem to me to be a tad unrealistic. Apple still has to play within the limits of available processes and physics.
Quote:
Originally Posted by rothers
Have you seen the videos which point to Apple playing a very clever game with the silicon, where it's suspected they can keep adding cores even from badly binned parts?
Not yet, but that's an interesting idea. It will be interesting to see how they plan to do that, as their CPUs are produced by an external partner which would then need to do it for them.
Quote:
We're at the start of a fascinating route by Apple; rumours are they could make a 256-core version of the M series pretty easily, and it might actually already exist.
That honestly seems extremely unlikely to me. The current 8-core part already takes 16 billion transistors, making it one of the biggest consumer CPU dies of this generation as it is. Upgrading it to 256 cores would mean adding an enormous number of additional transistors. Considering top-of-the-line server CPUs cap out at less than 40 billion (and need much more power), this is just not realistic unless Apple pretty radically changes the process.

Which, again, would be limited by what their fab partner can actually supply. TSMC are quite good at this, but they're not magic.
Quote:
Imagine having 256 cores at this low power level? This is exciting.
That will definitely not happen unless they manage to change the process radically, even if I'm wrong and they do manage to get a 256-core die. Having more cores with the same number of transistors per core, on the same production process, always means using more power (linearly so, in fact). Sure, you can power down or underclock unused cores, but that still won't solve the issue of what the power draw will be when you actually want to use all 256 cores. This is simply a consequence of physics.
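(Back-of-the-envelope, using the ~40 W full-load figure quoted earlier for the 8-core part and naively assuming power scales linearly with core count on the same process; illustrative only:)

Code:
# Naive linear scaling of ~40 W (8 cores, full load) to bigger core counts.
# Real designs juggle clocks, voltage and binning, so treat this as a
# rough sketch, not a prediction.
base_cores, base_watts = 8, 40
for cores in (8, 32, 128, 256):
    print(f"{cores:>3} cores -> ~{base_watts * cores // base_cores} W")
# 256 cores -> ~1280 W, far beyond any laptop or desktop power budget.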
Quote:
Imagine that power in a desktop or laptop? This is mad. They are coming for Intel.

I do also like what AMD are doing, but it's all last-gen; x86 is dead. It's like car companies still making ICE cars: it's cool to see them pushing oil to the limits, but EVs are the future.
Intel (and AFAIK even AMD) are also moving to the big.LITTLE idea, so don't count them out just yet.

Old 12 July 2021, 14:32   #97
d4rk3lf
Registered User
 
I think that for computer games to get much better, the graphics won't need to change much, no matter what technology jump is made.
Developers just need to stop treating players as morons and look back at the games of 15 or more years ago.

This guy explains very well what's wrong with modern games:
[ Show youtube player ]
Old 12 July 2021, 17:46   #98
funK
Registered User
 
Quote:
Originally Posted by trydowave
i agree. the changes are there but they aren't mind-blowing. i still have a 360 under my tv next to the ps4. played halo 4 the other day. looks surprisingly comparable graphics-wise for a console that came out sixteen years ago!
Sorry, but this is complete bullshit: unless you're also playing your PS4 on a really bad 540i TV from 16 years ago, the graphics of the two are not even close to comparable; see GTA V as an example:
Quote:

Source:
[ Show youtube player ]
But as others have said, generational graphics leaps aren't what they used to be anymore. In fact, Sony was smart enough to focus on aspects other than graphics when designing the PS5: haptic feedback and adaptive triggers on the controller, loading times comparable to cartridges, and 3D audio, when all used correctly by developers (e.g. Returnal, Ratchet & Clank, Demon's Souls), are game-changers and still bring that next-gen feel, so much so that going back to the PS4 honestly feels like a chore now.
Old 12 July 2021, 18:45   #99
mrsebe
Registered User
 
Replacing a normal hard drive with an SSD provided a big speed boost. But after that, SSDs evolved from MLC through TLC to QLC, and each new generation traded speed for density. QLC slows down considerably once writes exceed the cache buffer. On top of that come questionable practices like quietly replacing flash chips with slower ones without changing the model name.
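(For context: MLC, TLC and QLC store 2, 3 and 4 bits per cell respectively, and each extra bit doubles the number of voltage states the controller has to distinguish, which is exactly the density-for-speed trade described above. A quick Python illustration:)

Code:
# Bits stored per NAND cell vs. voltage states the controller must tell
# apart: each extra bit doubles the states (2**bits), raising density but
# hurting speed and endurance -- the MLC -> TLC -> QLC trade-off.
for name, bits in (("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4)):
    print(f"{name}: {bits} bit(s) per cell, {2 ** bits} voltage states")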
Old 12 July 2021, 22:04   #100
Foebane
Banned
 
Quote:
Originally Posted by grond
This hasn't got anything to do with overclocking. It's a fact that the power consumption of a CMOS integrated circuit grows with clock rate, even if the clock rate remains within the hardware specification.
Sorry, I was just reminded of that practice.
 

