TechZone
The Creation of a Benchmark
Added on: Wed Nov 21 2001

  1. The time is recorded manually with a stopwatch, from the start of the animation to its finish.
    In the example scene, that is when the slider stops at frame 100.

  2. After recording an initial sampling, the time is written down, along with the total number of frames, for later conversion into FPS. The next benchmark scene is then opened, and this process is repeated until all seventeen scenes have been timed.

  3. Upon completing the first set of scenes, Max is closed and reopened, and the process is repeated with the same
    settings, but this time in reverse order (starting with benchmark 17 instead of benchmark 1).

  4. This process is repeated an additional two times, for a total of four samplings per setting. (For example, four data sets for a Quadro DCC at 1600x1200x32 using OpenGL, i.e. 68 benchmarks per data set.)

  5. The data is entered into a spreadsheet program, where the four repetitions are averaged and frames per second are calculated.

  6. This method is repeated across a variety of variables: resolution, driver revision, API type, and even color depth and video card settings.
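The arithmetic behind steps 2 and 5 can be sketched as follows. This is a hypothetical illustration, not the article's actual spreadsheet: the function names and the sample timings are invented, but the calculation (frames divided by stopwatch seconds, averaged over the four samplings per setting) follows the procedure above.

```python
def fps(total_frames, seconds):
    """Frames per second for one timed playback run (step 2)."""
    return total_frames / seconds

def average_fps(total_frames, sample_times):
    """Average FPS over the repeated stopwatch samples of one scene (step 5)."""
    runs = [fps(total_frames, t) for t in sample_times]
    return sum(runs) / len(runs)

# Example: the 100-frame scene from step 1, timed four times
# (forward, reverse, forward, reverse). Timings are hypothetical.
times = [12.5, 12.7, 12.4, 12.6]  # seconds per run
print(round(average_fps(100, times), 2))  # -> 7.97
```

Averaging the per-run FPS values, rather than the raw times, matches the order of operations described in step 5.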

This method of data collection is extremely exhausting. It takes several days of continuous recording and retesting to verify the numbers and turn them into fact. The advantage of this method is redundancy: the sheer bulk of data eliminates most user-related errors, as well as any variance caused by system or software glitches. For an average video card there will be a few hundred cycles of data before it is finally tabulated and condensed into a readable format.

The average testing pattern for a particular card is as follows: one data set at 1600x1200 with 32-bit color, another at 16-bit color at the same resolution, and then a data set at 1280x1024 with 32-bit color to compare resolution differences. Additional tests are run to compare the card's speed in Direct3D, OpenGL, and any custom driver solutions.
Specific questions are addressed by comparing quality settings within these individual tests to further extrapolate data from the cards. All tests are run at 1600x1200x32 unless otherwise noted in the final reviews; an informal poll of Max users found this to be a middle-of-the-road resolution.
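The overall testing matrix described above can be tallied in a short sketch. The three resolution/color-depth data sets and the 17-scene, four-sampling structure come from the article; treating OpenGL and Direct3D as the two API columns is an assumption for illustration (custom driver modes would add more).

```python
SCENES = 17        # benchmark scenes per pass
REPETITIONS = 4    # samplings per setting (forward/reverse passes, run twice)

# The three resolution/color-depth data sets named in the article.
data_sets = [
    ("1600x1200", 32),   # baseline setting used unless otherwise noted
    ("1600x1200", 16),   # same resolution, lower color depth
    ("1280x1024", 32),   # lower resolution, same color depth
]
apis = ["OpenGL", "Direct3D"]  # assumed API columns; custom modes would add more

runs_per_data_set = SCENES * REPETITIONS          # 68 benchmarks per data set
total_runs = runs_per_data_set * len(data_sets) * len(apis)
print(runs_per_data_set, total_runs)
```

Even this minimal matrix yields 408 timed runs per card, which is consistent with the article's claim of "a few hundred cycles of data" for an average video card.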

© 1997-2024 3DLuVr™ (Three Dee Lover)
Best viewed in 1024x768 or higher,
using any modern CSS compliant browser.