ENSDF++ 1.1
An easy and fast way to run queries against the ENSDF database, written in C++.
Dataset Class Reference

This class corresponds to a specific dataset in the ENSDF database. More...

#include <Dataset.h>


Public Member Functions

 Dataset (list< string > textToInput)
 Constructor, constructs the dataset from a list of strings consisting of the dataset information in the ENSDF file.
 ~Dataset ()
 Destructor, deletes all records in the dataset.
list< Record * > getRecords () const
list< BetaRecordWrapper * > getBetaRecords () const
list< IdentificationRecord * > getIdentificationRecords () const
list< HistoryRecord * > getHistoryRecords () const
list< CrossReferenceRecord * > getCrossReferenceRecords () const
list< CommentRecord * > getCommentRecords () const
list< ParentRecord * > getParentRecords () const
list< NormalizationRecord * > getNormalizationRecords () const
list< ProductionNormalizationRecord * > getProductionNormalizationRecords () const
list< LevelRecord * > getLevelRecords () const
list< BetaMinusRecord * > getBetaMinusRecords () const
list< BetaPlusRecord * > getBetaPlusRecords () const
list< AlphaRecord * > getAlphaRecords () const
list< DelayedParticleRecord * > getDelayedParticleRecords () const
list< GammaRecord * > getGammaRecords () const
list< ReferenceRecord * > getReferenceRecords () const

Protected Member Functions

void generateRecordLists ()
 Generates all the record lists.
void initRecordStack ()
 Initializes the private member myRecordStack.
void initRecords ()
 Generates the list myRecords from all Records.
void initBetaRecords ()
 Generates the list myBetaRecords from all BetaPlusRecords and BetaMinusRecords.

Protected Attributes

list< Record * > myRecords
 All Records in the dataset.
list< BetaRecordWrapper * > myBetaRecords
 All BetaPlusRecords and BetaMinusRecords in the Dataset.
list< IdentificationRecord * > myIdentificationRecords
 The IdentificationRecords in this Dataset.
list< HistoryRecord * > myHistoryRecords
 The HistoryRecords in this Dataset.
list< CrossReferenceRecord * > myCrossReferenceRecords
 The CrossReferenceRecords in this Dataset.
list< CommentRecord * > myCommentRecords
 The CommentRecords in this Dataset.
list< ParentRecord * > myParentRecords
 The ParentRecords in this Dataset.
list< NormalizationRecord * > myNormalizationRecords
 The NormalizationRecords in this Dataset.
list< ProductionNormalizationRecord * > myProductionNormalizationRecords
 The ProductionNormalizationRecords in this Dataset.
list< LevelRecord * > myLevelRecords
 The LevelRecords in this Dataset.
list< BetaMinusRecord * > myBetaMinusRecords
 The BetaMinusRecords in this Dataset.
list< BetaPlusRecord * > myBetaPlusRecords
 The BetaPlusRecords in this Dataset.
list< AlphaRecord * > myAlphaRecords
 The AlphaRecords in this Dataset.
list< DelayedParticleRecord * > myDelayedParticleRecords
 The DelayedParticleRecords in this Dataset.
list< GammaRecord * > myGammaRecords
 The GammaRecords in this Dataset.
list< QValueRecord * > myQValueRecords
 The QValueRecords in this Dataset.
list< ReferenceRecord * > myReferenceRecords
 The ReferenceRecords in this Dataset.

Private Member Functions

void flushSpecific (RecordType toFlush)
 Flushes a list of accumulated records from myRecordStack. Used only during initialization.
void flushAllStacksExcept (int nbrOfRecords,...)
 Flushes all accumulated record lists except those whose RecordTypes are passed as variadic arguments; nbrOfRecords gives the number of RecordType arguments that follow. Call with argument 0 to flush all lists.
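The variadic calling convention can be illustrated with a standalone sketch (hypothetical names; the real implementation lives in Dataset.cpp):

```cpp
#include <cstdarg>
#include <set>

// Hypothetical record-type tags, mirroring a subset of the RecordType enum.
enum RecordType { IdentificationRecord_, ParentRecord_, GammaRecord_ };

// Collect the RecordTypes that should NOT be flushed, in the same
// calling style as flushAllStacksExcept(nbrOfRecords, ...).
std::set<RecordType> collectExceptions(int nbrOfRecords, ...)
{
  std::set<RecordType> keep;
  va_list args;
  va_start(args, nbrOfRecords);
  for (int i = 0; i < nbrOfRecords; ++i)
    keep.insert(static_cast<RecordType>(va_arg(args, int))); // enums promote to int in varargs
  va_end(args);
  return keep;
}
```

Calling collectExceptions(0) yields an empty exception set, i.e. every stack is flushed, matching the documented behaviour.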

Private Attributes

map< RecordType, list< string > > myRecordStack
 A map from each RecordType to its accumulated record lines. It is used to gather continuation records so that one single record can be created from them. Flushing a type means creating a record from its list and then emptying the list.
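The accumulate-and-flush pattern behind myRecordStack can be sketched standalone (for illustration, a "record" here is just its concatenated lines; the library builds real Record objects instead):

```cpp
#include <list>
#include <map>
#include <string>
#include <vector>

enum RecordType { LevelRecord_, GammaRecord_ };  // illustrative subset

std::map<RecordType, std::list<std::string> > stack;  // lines accumulated per type
std::vector<std::string> finishedRecords;             // records built so far

// Flush one type: build a single record from the accumulated lines
// (here, by concatenation), then empty the list.
void flush(RecordType t)
{
  std::list<std::string>& lines = stack[t];
  if (lines.empty()) return;
  std::string record;
  for (std::list<std::string>::iterator it = lines.begin(); it != lines.end(); ++it)
    record += *it;
  finishedRecords.push_back(record);
  lines.clear();
}
```

Pushing a primary line and its continuation line, then flushing, yields exactly one finished record.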

Detailed Description

This class corresponds to a specific dataset in the ENSDF database.

It contains all the records in the dataset.

Author:
Rikard Lundmark

Definition at line 66 of file Dataset.h.


Constructor & Destructor Documentation

Dataset::Dataset ( list< string >  textToInput)

Constructor, constructs the dataset from a list of strings consisting of the dataset information in the ENSDF file.

Parameters:
textToInput: All the records in the dataset, one line per record.

Definition at line 199 of file Dataset.cpp.

References AlphaRecord_, BetaMinusRecord_, BetaPlusRecord_, CommentRecord_, CrossReferenceRecord_, DelayedParticleRecord_, flushAllStacksExcept(), flushSpecific(), GammaRecord_, HistoryRecord_, IdentificationRecord_, initBetaRecords(), initRecords(), initRecordStack(), LevelRecord_, myRecordStack, NormalizationRecord_, ParentRecord_, ProductionNormalizationRecord_, and QValueRecord_.

{
  initRecordStack();
  for(list<string>::iterator it = textToInput.begin(); it!=textToInput.end();it++)
    {
      string local = *it;
      if(local[6]==' ' && local[7]==' ' && local[8]==' ') //IdentificationRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[IdentificationRecord_].empty())
                throw DataFileException("No IdentificationRecord to continue: \n" + local);
              flushAllStacksExcept(1,IdentificationRecord_);
            }
          myRecordStack[IdentificationRecord_].push_back(local);
        }     
      else if(local[6]==' ' && local[7]=='H' && local[8]==' ') //HistoryRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[HistoryRecord_].empty())
                throw DataFileException("No HistoryRecord to continue: \n" + local);
            }
          myRecordStack[HistoryRecord_].push_back(local);
        }
      else if(local[6]== ' ' && local[7]=='Q' && local[8]==' ') //QValueRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[QValueRecord_].empty())
                throw DataFileException("No QValueRecord to continue: \n" + local);
              flushAllStacksExcept(1,QValueRecord_);
            }
          myRecordStack[QValueRecord_].push_back(local);
        }
      else if(local[6]==' ' && local[7]=='X') //CrossReferenceRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[CrossReferenceRecord_].empty())
                throw DataFileException("No CrossReferenceRecord to continue: \n" + local);
            }
          myRecordStack[CrossReferenceRecord_].push_back(local);
        }
      else if((local[6]=='C' || local[6]=='D' || local[6]=='T' || local[6]=='c' || local[6]=='d' || local[6]=='t')) //Comment record
        {
          if(local[5]==' ')
            {
              flushSpecific(CommentRecord_);
            }
          else
            {
              if(myRecordStack[CommentRecord_].empty())
                throw DataFileException("No CommentRecord to continue: \n" + local);
            }
          myRecordStack[CommentRecord_].push_back(local);
        }
      else if(local[6]==' ' && local[7]=='P') //Parent record
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[ParentRecord_].empty())
                throw DataFileException("No ParentRecord to continue: \n" + local);
              flushAllStacksExcept(1,ParentRecord_);
            }
          myRecordStack[ParentRecord_].push_back(local);
        }
      else if(local[6]==' ' && local[7]=='N') //NormalizationRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[NormalizationRecord_].empty())
                throw DataFileException("No NormalizationRecord to continue: \n" + local);
              flushAllStacksExcept(1,NormalizationRecord_);
            }
          myRecordStack[NormalizationRecord_].push_back(local);
        }
      else if(local[6]=='P' && local[7]=='N' && local[8]== ' ') //ProductionNormalizationRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[ProductionNormalizationRecord_].empty())
                throw DataFileException("No ProductionNormalizationRecord to continue: \n" + local);
              flushAllStacksExcept(1,ProductionNormalizationRecord_);
            }
          myRecordStack[ProductionNormalizationRecord_].push_back(local);
        }
      else if(local[6]==' ' && local[7]=='L' && local[8]== ' ') //LevelRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[LevelRecord_].empty())
                throw DataFileException("No LevelRecord to continue: \n" + local);
              flushAllStacksExcept(1,LevelRecord_);
            }
          myRecordStack[LevelRecord_].push_back(local);
        }
      else if(local[6]==' ' && local[7]=='B' && local[8]==' ') //BetaMinusRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[BetaMinusRecord_].empty())
                throw DataFileException("No BetaMinusRecord to continue: \n" + local);
              flushAllStacksExcept(1,BetaMinusRecord_);
            }
          myRecordStack[BetaMinusRecord_].push_back(local);
        }
      else if(local[6]==' ' && local[7]=='E' && local[8]==' ') //BetaPlusRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[BetaPlusRecord_].empty())
                throw DataFileException("No BetaPlusRecord to continue: \n" + local);
              flushAllStacksExcept(1,BetaPlusRecord_);
            }
          myRecordStack[BetaPlusRecord_].push_back(local);
        }
      else if(local[6]==' ' && local[7]=='A' && local[8]==' ') //AlphaRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[AlphaRecord_].empty())
                throw DataFileException("No AlphaRecord to continue: \n" + local);
              flushAllStacksExcept(1,AlphaRecord_);
            }
          myRecordStack[AlphaRecord_].push_back(local);
        }
      else if(local[6]==' ' && (local[7]=='D' || local[7]==' ') && (local[8]=='N' || local[8]=='P' || local[8]=='A')) //DelayedParticleRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[DelayedParticleRecord_].empty())
                throw DataFileException("No DelayedParticleRecord to continue: \n" + local);
              flushAllStacksExcept(1,DelayedParticleRecord_);
            }
          myRecordStack[DelayedParticleRecord_].push_back(local);
        }
      else if(local[6]==' ' && local[7]=='G' && local[8]==' ') //GammaRecord
        {
          if(local[5]==' ')
            {
              flushAllStacksExcept(0);
            }
          else
            {
              if(myRecordStack[GammaRecord_].empty())
                throw DataFileException("No GammaRecord to continue: \n" + local);
              flushAllStacksExcept(1,GammaRecord_);
            }
          myRecordStack[GammaRecord_].push_back(local);
        }
      else if(local[3]==' ' && local[4]==' ' && local[5]==' ' && local[6]==' ' && local[7]=='R' && local[8]==' ') //ReferenceRecord
        {
          myRecordStack[CrossReferenceRecord_].push_back(local);
          flushAllStacksExcept(0);
        }
      else //if this occurs, we either have a) a file error or b) an error in this software. Either way, throw an exception.
        {
          throw DataFileException("Invalid entry detected in data files: \n" + local);
        }
    }
  flushAllStacksExcept(0);
  initRecords();
  initBetaRecords();
}
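The branching above keys on fixed ENSDF card columns: the character at index 5 flags a continuation card when non-blank, and the characters at indices 6–8 carry the record-type code. A minimal standalone classifier for a few of the types (a hypothetical helper, not part of the library):

```cpp
#include <string>

// Classify a few ENSDF card types by the characters at indices 6-8,
// mirroring the tests in the Dataset constructor.
std::string classify(const std::string& card)
{
  if (card.size() < 9) return "Invalid";
  if (card[6] == ' ' && card[7] == ' ' && card[8] == ' ') return "Identification";
  if (card[6] == ' ' && card[7] == 'L' && card[8] == ' ') return "Level";
  if (card[6] == ' ' && card[7] == 'G' && card[8] == ' ') return "Gamma";
  if (card[6] == ' ' && card[7] == 'Q' && card[8] == ' ') return "QValue";
  return "Other";
}

// A card continues the previous record when index 5 is not blank.
bool isContinuation(const std::string& card)
{
  return card.size() > 5 && card[5] != ' ';
}
```

For example, a card whose first five characters hold the nuclide id, with a blank at index 5 and 'L' at index 7, is a primary level card; putting a digit at index 5 marks it as a continuation.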

Member Function Documentation

void Dataset::flushSpecific ( RecordType  toFlush) [private]

Flushes a list of accumulated records from myRecordStack. Used upon initializing only.

Parameters:
toFlush: the RecordType to flush.

Definition at line 101 of file Dataset.cpp.

References AlphaRecord_, BetaMinusRecord_, BetaPlusRecord_, CommentRecord_, CrossReferenceRecord_, DelayedParticleRecord_, GammaRecord_, HistoryRecord_, IdentificationRecord_, LevelRecord_, myAlphaRecords, myBetaMinusRecords, myBetaPlusRecords, myCommentRecords, myCrossReferenceRecords, myDelayedParticleRecords, myGammaRecords, myHistoryRecords, myIdentificationRecords, myLevelRecords, myNormalizationRecords, myParentRecords, myProductionNormalizationRecords, myQValueRecords, myRecordStack, NormalizationRecord_, NULL, ParentRecord_, and ProductionNormalizationRecord_.

Referenced by Dataset(), and flushAllStacksExcept().

{
  if(myRecordStack.find(toFlush)==myRecordStack.end())
    throw DataFileException("Cannot flush the attempted list.");
  
  for(map<RecordType, list<string> >::iterator it = myRecordStack.begin(); it!=myRecordStack.end(); it++)
    {
      if(it->first==toFlush)
        {
          if(!it->second.empty())
            {
              ParentRecord * ParentReference = NULL;
              if(!myParentRecords.empty())
                  ParentReference = myParentRecords.back();
              NormalizationRecord * NormalizationReference = NULL;
              if(!myNormalizationRecords.empty())
                NormalizationReference = myNormalizationRecords.back();
              QValueRecord * QValueReference = NULL;
              if(!myQValueRecords.empty())
                QValueReference = myQValueRecords.back();
              LevelRecord * LevelReference = NULL;
              if(!myLevelRecords.empty())
                LevelReference = myLevelRecords.back();
              if(toFlush==IdentificationRecord_)
                {
                  myIdentificationRecords.push_back(new IdentificationRecord(it->second));
                }
              else if(toFlush==HistoryRecord_)
                {
                  myHistoryRecords.push_back(new HistoryRecord(it->second));
                }
              else if(toFlush==CrossReferenceRecord_)
                {
                  myCrossReferenceRecords.push_back(new CrossReferenceRecord(it->second));
                }
              else if(toFlush==CommentRecord_)
                {
                  myCommentRecords.push_back(new CommentRecord(it->second));
                }
              else if(toFlush==ParentRecord_)
                {
                  myParentRecords.push_back(new ParentRecord(it->second));
                }
              else if(toFlush==NormalizationRecord_)
                {
                  myNormalizationRecords.push_back(new NormalizationRecord(it->second, ParentReference));
                }
              else if(toFlush==ProductionNormalizationRecord_)
                {
                  myProductionNormalizationRecords.push_back(new ProductionNormalizationRecord(it->second, NormalizationReference));
                }
              else if(toFlush==LevelRecord_)
                {
                  myLevelRecords.push_back(new LevelRecord(it->second, QValueReference));
                }
              else if(toFlush==BetaMinusRecord_)
                {
                  myBetaMinusRecords.push_back(new BetaMinusRecord(it->second, NormalizationReference, LevelReference, ParentReference, QValueReference));
                }
              else if(toFlush==BetaPlusRecord_)
                {
                  myBetaPlusRecords.push_back(new BetaPlusRecord(it->second, NormalizationReference, LevelReference, ParentReference, QValueReference));
                }
              else if(toFlush==AlphaRecord_)
                {
                  myAlphaRecords.push_back(new AlphaRecord(it->second, NormalizationReference, LevelReference, ParentReference, QValueReference));
                }
              else if(toFlush==DelayedParticleRecord_)
                {
                  myDelayedParticleRecords.push_back(new DelayedParticleRecord(it->second, NormalizationReference, LevelReference, ParentReference, QValueReference));
                }
              else if(toFlush==GammaRecord_)
                {
                  myGammaRecords.push_back(new GammaRecord(it->second, LevelReference, NormalizationReference, QValueReference));
                }
              it->second.clear();
            }
        }
    }
}
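Note how each newly flushed record is linked to the most recent ParentRecord, NormalizationRecord, QValueRecord and LevelRecord via back(): a gamma, for instance, is tied to the last level seen before it. A reduced sketch of this latest-wins linking (illustrative types only, not the library's classes):

```cpp
#include <list>
#include <string>

struct Level { std::string energy; };
struct Gamma { std::string energy; const Level* from; };

std::list<Level*> levels;  // flushed levels, in file order
std::list<Gamma*> gammas;  // flushed gammas, each linked to a level

// Mirror the flushing logic: a new Gamma references the most
// recently flushed Level, or nothing if no Level exists yet.
void addGamma(const std::string& energy)
{
  const Level* ref = levels.empty() ? 0 : levels.back();
  Gamma* g = new Gamma;
  g->energy = energy;
  g->from = ref;
  gammas.push_back(g);
}
```

This is why record order in the ENSDF file matters: the back-references are resolved purely by position during parsing.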
list< AlphaRecord * > Dataset::getAlphaRecords ( ) const
Returns:
the AlphaRecords in this Dataset.

Definition at line 539 of file Dataset.cpp.

References myAlphaRecords.

{
  return myAlphaRecords;
}
list< BetaMinusRecord * > Dataset::getBetaMinusRecords ( ) const
Returns:
the BetaMinusRecords in this Dataset.

Definition at line 529 of file Dataset.cpp.

References myBetaMinusRecords.

{
  return myBetaMinusRecords;
}
list< BetaPlusRecord * > Dataset::getBetaPlusRecords ( ) const
Returns:
the BetaPlusRecords in this Dataset.

Definition at line 534 of file Dataset.cpp.

References myBetaPlusRecords.

{
  return myBetaPlusRecords;
}
list< BetaRecordWrapper * > Dataset::getBetaRecords ( ) const
Returns:
All the BetaPlusRecords and BetaMinusRecords in this Dataset, cast to BetaRecordWrapper pointers.

Definition at line 431 of file Dataset.cpp.

References myBetaRecords.

Referenced by Dataset_TEST::Dataset_CreatesCorrectBetaRecords_AssertTrue().

{
  return myBetaRecords;
}
list< CommentRecord * > Dataset::getCommentRecords ( ) const
Returns:
The CommentRecords in this Dataset.

Definition at line 504 of file Dataset.cpp.

References myCommentRecords.

{
  return myCommentRecords;
}
list< CrossReferenceRecord * > Dataset::getCrossReferenceRecords ( ) const
Returns:
The CrossReferenceRecords in this Dataset.

Definition at line 499 of file Dataset.cpp.

References myCrossReferenceRecords.

{
  return myCrossReferenceRecords;
}

list< DelayedParticleRecord * > Dataset::getDelayedParticleRecords ( ) const
Returns:
the DelayedParticleRecords in this Dataset.

Definition at line 544 of file Dataset.cpp.

References myDelayedParticleRecords.

{
  return myDelayedParticleRecords;
}

list< GammaRecord * > Dataset::getGammaRecords ( ) const
Returns:
the GammaRecords in this Dataset.

Definition at line 549 of file Dataset.cpp.

References myGammaRecords.

{
  return myGammaRecords;
}
list< HistoryRecord * > Dataset::getHistoryRecords ( ) const
Returns:
The HistoryRecords in this Dataset.

Definition at line 494 of file Dataset.cpp.

References myHistoryRecords.

{
  return myHistoryRecords;
}
list< IdentificationRecord * > Dataset::getIdentificationRecords ( ) const
Returns:
The IdentificationRecords in this Dataset.

Definition at line 489 of file Dataset.cpp.

References myIdentificationRecords.

{
  return myIdentificationRecords;
}

list< LevelRecord * > Dataset::getLevelRecords ( ) const
Returns:
the LevelRecords in this Dataset.

Definition at line 524 of file Dataset.cpp.

References myLevelRecords.

Referenced by DataQueryBetaGamma::findNextLevel(), and Record_TEST::testLevelHalfLife().

{
  return myLevelRecords;
}
list< NormalizationRecord * > Dataset::getNormalizationRecords ( ) const
Returns:
the NormalizationRecords in this Dataset.

Definition at line 514 of file Dataset.cpp.

References myNormalizationRecords.

{
  return myNormalizationRecords;
}

list< ParentRecord * > Dataset::getParentRecords ( ) const
Returns:
the ParentRecords in this Dataset.

Definition at line 509 of file Dataset.cpp.

References myParentRecords.

{
  return myParentRecords;
}
list< ProductionNormalizationRecord * > Dataset::getProductionNormalizationRecords ( ) const
Returns:
the ProductionNormalizationRecords in this Dataset.

Definition at line 519 of file Dataset.cpp.

References myProductionNormalizationRecords.

{
  return myProductionNormalizationRecords;
}

list< Record * > Dataset::getRecords ( ) const
Returns:
All the Records in this dataset.

Definition at line 426 of file Dataset.cpp.

References myRecords.

{
  return myRecords;
}
list< ReferenceRecord * > Dataset::getReferenceRecords ( ) const
Returns:
the ReferenceRecords in this Dataset.

Definition at line 554 of file Dataset.cpp.

References myReferenceRecords.

{
  return myReferenceRecords;
}

The documentation for this class was generated from the following files:
Dataset.h
Dataset.cpp

Back to the main page of the Precalibrated Ion Beam Identification Detector project

Created by Rikard Lundmark