Hi Adrian,

This is strange! I have seen this happening in the following two situations:

 - disk quota exceeded when converting from HBOOK to ROOT (but in this case you should get plenty of warnings and error messages);
 - AFS cache corruption.

Could you send me (not to the list) both the original hbook file and the root file generating the error messages?

Rene Brun

Adrian John Bevan wrote:
>
> Hi,
>
> I am using ROOT version 2.25/03 on a Sun running Solaris 2.6. I've just
> started to play with some hbook files I produced, converting them to
> root format with h2root, and I want to do some analysis with them.
>
> The hbook file is about 6 MB and the root one 1.5 MB:
>     5.7M b0pi0pi0-382459-382459.hbook
>     1.5M b0pi0pi0-382459-382459.root
>
> I got a very disconcerting warning from ROOT after opening the file and
> trying to load it into memory (I wanted to look at an ntuple):
>
> root [1] Error in <TBuffer::ReadObject>: object tag too large, I/O buffer corrupted
> Error in <TBuffer::ReadObject>: object tag too large, I/O buffer corrupted
> Error in <TBuffer::ReadObject>: object tag too large, I/O buffer corrupted
>
> *** Break *** segmentation violation
>
> On another Sun system (also Solaris 2.6), ROOT 2.23 will open other root
> files (made from hbook files produced by the same analysis code).
>
> Rebuilding the first file worked (on the same system, using the same
> h2root binary); ROOT gave no errors when, on opening the file, I tried
> to display the content of the ntuple (this is what went wrong before).
> Has anyone encountered similar errors on a regular basis, or was this
> just some freak one-off that will be unrepeatable or very infrequent?
> All other times that I have used h2root it has yielded no obvious
> problems whatsoever.
>
> I just thought I'd ask, as I will have quite a lot of hbook file
> conversions to do in the coming month.
>
> Thanks,
>
> Adrian
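[Editor's note: under the disk-quota hypothesis raised above, a quick sanity check after each conversion is to confirm the output file was actually written with non-zero size before opening it in ROOT. This is only a sketch: the file names are taken from the report, `h2root` is assumed to be on the user's PATH, and the actual conversion call is left commented out.]

```shell
#!/bin/sh
# Hedged sketch: detect a truncated or missing h2root output file
# (e.g. the result of an exceeded disk quota) before trying to open
# it in ROOT. File names are those from Adrian's report.
in=b0pi0pi0-382459-382459.hbook
out=b0pi0pi0-382459-382459.root

# Uncomment on a system with h2root installed:
# h2root "$in" "$out" || { echo "h2root failed"; exit 1; }

# A zero-length or absent .root file suggests the write was cut short.
if [ -s "$out" ]; then
    echo "output looks non-empty"
else
    echo "output missing or empty - check quota and rerun"
fi
```

A file that passes this check can still be corrupt (as in the AFS-cache case), so it only rules out the most obvious failure mode.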
This archive was generated by hypermail 2b29 : Tue Jan 02 2001 - 11:50:38 MET