
Which advances faster: Software or Hardware?

Posted: Fri Jun 03, 2011 5:45 am
by soulless
Hi,

I need to survey 15 or so people about this; it's something (what people think about it) to discuss in class at university.

I don't need the correct answer, just what you think about it and, if you want, why.

Cheers!!!

Posted: Fri Jun 03, 2011 7:39 am
by Radikalizm
Moore's law, anyone? That's a pretty vague question right there...
Software is an enormously broad concept, and so is hardware; you can't say that one is advancing faster than the other unless you look at them in a very narrow way.

Also, don't you think that the two are kind of tied to each other?

Posted: Fri Jun 03, 2011 12:57 pm
by Xaryl
Consider defining a measure of advancement; otherwise the question is as meaningful as "which one is getting crispier faster?"

Posted: Fri Jun 03, 2011 12:59 pm
by Radikalizm
Xaryl wrote:Consider defining a measure of advancement; otherwise the question is as meaningful as "which one is getting crispier faster?"
QFT :D Crispiness would be an awesome way to measure software advancement though

Posted: Fri Jun 03, 2011 2:23 pm
by ChaiRuiPeng
Xaryl wrote:Consider defining a measure of advancement; otherwise the question is as meaningful as "which one is getting crispier faster?"
[didn't happen]
Well, one time my video card caught fire; in the process the RAM sticks got a nice layer of black electronics smoke, but they weren't crispy :D [/didn't happen]

Posted: Fri Jun 03, 2011 4:34 pm
by ACE247
Well, I do know for certain that most of the graphics rendering techniques we use today were already developed during the '80s and '90s (some even in the '60s), except back then they ran on house-sized hardware.

Maybe just check up on the history of some of the techniques you use.
In 1973, Bui Tuong Phong at Utah developed a new shading method that improved on the older Gouraud shading. Phong's method renders the colors on a mesh surface more accurately and produces convincing reflective (specular) shading, but both Gouraud and Phong shading have difficulty smoothing the outline (silhouette) edge of a 3D object.
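
To make that concrete, here's a rough C++ sketch of the difference (the Vec3 helpers and function names are mine, just for illustration, and it's simplified to interpolating along a single edge): Gouraud evaluates lighting at the vertices and interpolates the resulting intensity, while Phong interpolates the normal and evaluates lighting per pixel.

Code:
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 lerp(Vec3 a, Vec3 b, float t)
{
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static Vec3 normalize(Vec3 v)
{
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Lambertian diffuse term used by both methods; lightDir must be unit length.
static float lambert(Vec3 n, Vec3 lightDir)
{
    return std::max(0.0f, dot(n, lightDir));
}

// Gouraud: light at the two vertices, then interpolate the resulting
// *intensity* across the span. Cheap, but a highlight that peaks between
// the vertices simply gets lost.
float gouraudShade(Vec3 n0, Vec3 n1, Vec3 lightDir, float t)
{
    float i0 = lambert(normalize(n0), lightDir);
    float i1 = lambert(normalize(n1), lightDir);
    return i0 + (i1 - i0) * t;
}

// Phong: interpolate the *normal*, then light per pixel. Smooth highlights,
// at the cost of a normalize and a lighting evaluation for every pixel.
float phongShade(Vec3 n0, Vec3 n1, Vec3 lightDir, float t)
{
    Vec3 n = normalize(lerp(n0, n1, t));
    return lambert(n, lightDir);
}

That per-pixel normalize and lighting evaluation is a big part of why Phong shading stayed out of real-time reach for so long.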

In 1978, James Blinn, also at Utah, developed a new technique called bump mapping, which simulates the roughness of a surface by interpreting a grey-scale map.
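
The idea in a nutshell, as a sketch (heightAt is an assumed sampler, not a real API): the grey-scale map is treated as a height field whose gradient tilts the shading normal, so the surface looks rough without any extra geometry.

Code:
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// heightAt is an assumed sampler returning the grey-scale map value in
// 0..1 at texel (x, y); hook it up to whatever texture access you have.
extern float heightAt(int x, int y);

// Bump mapping: perturb the shading normal from the height gradient
// (central differences) without touching the actual geometry.
Vec3 bumpedNormal(int x, int y, float strength)
{
    float dx = heightAt(x + 1, y) - heightAt(x - 1, y);
    float dy = heightAt(x, y + 1) - heightAt(x, y - 1);
    // In tangent space a flat, unperturbed surface has normal (0, 0, 1).
    return normalize({ -dx * strength, -dy * strength, 1.0f });
}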

Posted: Fri Jun 03, 2011 4:38 pm
by Radikalizm
That's kind of an unwritten rule in graphics technology: once a technique or algorithm is developed for non-real-time use, it takes about 20 years before it becomes feasible to use in real-time applications.

Even the concept of deferred rendering, which is now all the rage in a lot of rendering engines, was conceived back in 1988.
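
For anyone taking the survey who hasn't met it, the gist of deferred rendering in a little sketch (all the names here are made up for illustration; a real engine runs both passes on the GPU with the G-buffer in render targets, this CPU-side version just shows the structure): you rasterize the scene once into a per-pixel "G-buffer" of surface attributes, then run lighting as a separate pass over pixels.

Code:
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }

// One G-buffer entry per screen pixel, written during the geometry pass.
struct GBufferTexel {
    Vec3 position;  // world-space position of the front-most surface
    Vec3 normal;    // its surface normal
    Vec3 albedo;    // its material colour
};

struct Light { Vec3 position; Vec3 color; };

// Assumed BRDF evaluation; plug in the lighting model of your choice.
extern Vec3 shade(const GBufferTexel& g, const Light& l);

// Pass 1 (not shown) rasterizes the scene once and fills the G-buffer.
// Pass 2 below lights each pixel from the stored attributes, so the
// lighting cost is pixels * lights, independent of scene complexity.
void lightingPass(const std::vector<GBufferTexel>& gbuffer,
                  const std::vector<Light>& lights,
                  std::vector<Vec3>& framebuffer)
{
    for (std::size_t i = 0; i < gbuffer.size(); ++i) {
        Vec3 out = { 0.0f, 0.0f, 0.0f };
        for (std::size_t j = 0; j < lights.size(); ++j)
            out = add(out, shade(gbuffer[i], lights[j]));
        framebuffer[i] = out;
    }
}

Decoupling lighting cost from scene complexity is exactly why it's so popular now that GPUs have the bandwidth for fat G-buffers.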

Posted: Fri Jun 03, 2011 5:59 pm
by ACE247
Now what do we call it? :D
I can't come up with something fitting.

PS: I think hardware and software advance at the same rate; hardware is just quite a bit behind. (As if software existed before hardware...)

Posted: Fri Jun 03, 2011 6:17 pm
by hendu
Hardware tends to advance, while software tends to regress. This is one of Murphy's corollaries too: software will fill all available CPU cycles.

Posted: Fri Jun 03, 2011 6:42 pm
by Radikalizm
hendu wrote:Hardware tends to advance, while software tends to regress. This is one of Murphy's corollaries too: software will fill all available CPU cycles.
I wouldn't call using all available CPU cycles regression; I'd call that using all available resources.

If you looked at a single, fixed set of hardware, then yes, software would get slower and slower, because more complex software requires more resources. But then again, if only a single set of hardware were available and no progress were being made in that area, software probably wouldn't get slower, since developers could keep those hardware limitations in mind.

Posted: Fri Jun 03, 2011 6:56 pm
by devsh
Definitely hardware; look at a case study... Windows.

Exponential regression each year: slower and slower and more buggy... doing the exact same thing takes up more resources with each version.

Posted: Fri Jun 03, 2011 7:13 pm
by Radikalizm
devsh wrote:Definitely hardware; look at a case study... Windows.

Exponential regression each year: slower and slower and more buggy... doing the exact same thing takes up more resources with each version.
We all know that you are rather biased :D

Windows 7 is a great improvement over Vista and XP alike, just like XP was a great improvement over Me, and I can go on like this for a while

If it were true that Windows was regressing each year, we would still be running a Windows 3.x environment.

Windows has had and still has its issues, but plain bashing because it's cool to bash Windows is rather stupid.

Posted: Fri Jun 03, 2011 7:47 pm
by ACE247
Actually, I'd say the hardware and software development rates are simply tied together: if the hardware advances, software will exploit or over-exploit it; if the software advances, new hardware will meet or surpass its requirements.

Neither is ahead all or even most of the time; once one pulls ahead, the other catches up and overtakes it.

It's competition! :D

Posted: Fri Jun 03, 2011 11:30 pm
by Mel
Software is designed to fill all the CPU cycles, but it is a fact that hardware advances much faster. The proof is the variety of solutions to the same problem you can find out there, many of which don't use the available hardware efficiently, for lack of knowledge.

It is relatively easy to think up new hardware; putting it to work is the hard part, and thus software is always behind hardware development.

Posted: Sat Jun 04, 2011 2:35 am
by soulless
Sorry I didn't comment for a whole day; I had to attend to a family medical matter.

My question refers (or is meant to refer) to the rate of improvement and/or the appearance of new things/new technology. Sorry if it's not clear enough or too vague; my English has limitations :roll:

PS: I won't comment on the question itself or on your conversation about it, because as the pollster my opinion must not count or influence the conversation.

Thanks all of you for sharing your opinions!!!