[Clusterusers] glibc version on fly, cluster

Bassam Kurdali bassam at urchn.org
Mon May 22 17:49:26 EDT 2017


OK, so I think the only workable choice, short of an upgrade, is choice
1, because:
chroot would likely crash and burn, since:
 a- you have to produce an entire system inside it
 b- glibc is tied to the running kernel, so it would likely crash anyway

docker needs a newer kernel (3.10 or higher, IIRC; we're on 2.6)
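
For reference, here's a minimal Python sketch to check what a node is
actually running against those two cutoffs (the 3.10 and 2.19
thresholds are just the docker and blender requirements mentioned
above):

# check_versions.py - print the node's kernel and glibc versions
import platform

kernel = platform.release()                # e.g. "2.6.32-431.el6.x86_64"
libc_name, libc_ver = platform.libc_ver()  # e.g. ("glibc", "2.12")
print("kernel: {0}".format(kernel))
print("libc:   {0} {1}".format(libc_name, libc_ver))

def ver(v):
    # first two numeric components: "2.6.32-431.el6..." -> (2, 6)
    return tuple(int(p) for p in v.split("-")[0].split(".")[:2])

print("docker-capable kernel (>= 3.10): {0}".format(ver(kernel) >= (3, 10)))
print("glibc >= 2.19 for new blender:   {0}".format(ver(libc_ver) >= (2, 19)))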

I've failed miserably to build blender on fly even before bf bumped the
glibc requirement - let's see if I can succeed this time. I'll need to
install stuff like cmake (and who knows what else), so I'll need some
help with access.
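
Once there's a build (or with the official binaries), here's a rough
Python sketch to list which GLIBC_x.y symbol versions a binary actually
pulls in - it assumes binutils' objdump is on the PATH, and the blender
path is just an example. If a fully static build works, the list should
come back empty:

# glibc_reqs.py - list the GLIBC_x.y symbol versions a binary needs
import re
import subprocess
import sys

binary = sys.argv[1] if len(sys.argv) > 1 else "./blender"  # example path
out = subprocess.check_output(["objdump", "-T", binary])
versions = set(re.findall(r"GLIBC_([0-9.]+)", out.decode("ascii", "replace")))

def ver(v):
    return tuple(int(p) for p in v.split("."))

if versions:
    for v in sorted(versions, key=ver):
        print(v)
    print("max required: {0}".format(max(versions, key=ver)))
else:
    print("no GLIBC version references found (statically linked?)")
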
cheers,
Bassam

On Mon, 2017-05-22 at 11:10 -0400, Wm. Josiah Erikson wrote:
> Hm. So...
> 
> Even CentOS 7.3 (the latest) won't get you up to that version of
> glibc, and updating glibc on a system is, well, not really feasible,
> since basically everything links against it.
> 
> So I think we have four choices:
> 
>     1. See if you can build a statically-linked version of blender
> that is portable to the cluster. It's not clear to me from the
> instructions on static linking whether you can statically link glibc
> or not.
> 
>     2. Build a chrooted environment on the cluster to build and run
> blender in
> 
>     3. Update the Macs and use them only
> 
>     4. Use docker or something to deploy minimal blender servers. I
> haven't messed with docker yet and don't know how feasible this would
> be.
> 
>     -Josiah
> 
> 
> 
> On 5/15/17 2:28 PM, Bassam Kurdali wrote:
> > Thanks Josiah! Glad that it's not just me and my rendering ;)
> > Crossing fingers and hoping it won't be a big disruptive change
> > (from my perspective, if we can get to glibc 2.19 or beyond I'll be
> > happy as a clam).
> > On Fri, 2017-05-12 at 22:26 -0400, Wm. Josiah Erikson wrote:
> > > It looks like there are ways to run newer versions of CentOS
> > > underneath ROCKS 6.2... will research more and attempt to update
> > > next week if it's not too disruptive, later this summer if it is.
> > > You're hardly the only person noticing how out-of-date everything
> > > is on the cluster. Of course I also worry that we've been pwned
> > > for years and just don't know it. I see no evidence of that... but
> > > that doesn't mean much. It's past time for a major overhaul.
> > > 
> > >     -Josiah
> > > 
> > > 
> > > 
> > > On 5/12/17 10:08 PM, Wm. Josiah Erikson wrote:
> > > > Well, really, the issue is that we're still running ROCKS, and
> > > > we're a version behind... but ROCKS itself seems to have fallen
> > > > behind, and a new release hasn't come out in almost two years. I
> > > > think it's time to move to something else and probably rebuild
> > > > the cluster entirely. The latest version of ROCKS is based on
> > > > CentOS 6.6, which is already out of support and doesn't have a
> > > > new glibc either.
> > > > 
> > > > I will do some research, ask my HPC colleagues what they're
> > > > running these days, and ask the ROCKS list what's up - I think
> > > > they were NSF-funded and, well...
> > > > 
> > > > That said, we should figure out an interim solution. I shall
> > > > think about this more next week.
> > > > 
> > > >     -Josiah
> > > > 
> > > > 
> > > > 
> > > > On 5/12/17 9:08 PM, Bassam Kurdali wrote:
> > > > > hi folks,
> > > > > we're currently at glibc 2.12, which is fairly old - there's a
> > > > > shiny new blender with pretty impressive rendering
> > > > > improvements (new shaders and massive speed increases) - but
> > > > > as of the *last* blender version, they dropped support for
> > > > > glibc older than 2.19.
> > > > > 
> > > > > 
> > > > > This is mainly a question for Josiah, but can we do something
> > > > > like have two glibc versions on the system? Or some kind of
> > > > > fancy chroot/local environment? Failing that, I might need
> > > > > some help building for fly (installing dependencies, cmake,
> > > > > etc.).
> > > > > 
> > > > > cheers,
> > > > > Bassam