[svlug] Maximum memory (debian 3.1)
echerlin at gmail.com
Wed Mar 21 13:44:08 PST 2007
On 3/20/07, Greg Lindahl <greg.lindahl at qlogic.com> wrote:
> On Tue, Mar 20, 2007 at 03:25:34PM -0700, Rick Moen wrote:
> > Point taken. We can hope that the situation will improve, as toolchains
> > are optimised for the newer binary environment.
> Neither of the examples I gave will improve. Our compiler is used to
> build itself, and we're a much better 64-bit compiler than 32-bit
> compiler (in 64 bit mode, we're #1 on Opteron, and tied with Intel for
> floating point on Woodcrest), but it uses a lot of pointers, and
> making them twice as large is a big loss.
How big a loss, vs. how much gain in performance? Even without seeing
the numbers, I'm sure that I would take the hit on memory for any
system that I would actually use for compute-intensive work. At
$100/G, what's the issue?
Back in 1990 I met some people from Morgan-Stanley who were
complaining that Sun couldn't build them workstations with more than
2G of memory. They had APL programmers doing the heavy financial
lifting, and C programmers who had to convert some of the run-once APL
code (created to evaluate a deal in under two hours, where C would be
hopeless) into production C. Nested-array APLs are implemented
(typically in C) using arrays of pointers to arrays. I suppose LISP
would take a bigger hit (depending
on how many cons cells get optimized away), but in AI research, even
doubling total memory usage would be a small price to pay for having
the memory space at all. YMMV greatly.
> -- greg
Earth Treasury: End Poverty at a Profit
WIRE AFRICA http://www.wireafrica.org/