Quickstart
This guide will get you started with SPARKX quickly. SPARKX is designed to help users of the SMASH and JETSCAPE/X-SCAPE codes analyze their simulation output with ease.
Key Components of SPARKX
The following classes will be useful for loading, filtering, and analyzing your SMASH or JETSCAPE/X-SCAPE data:
Data Loading and Filtering
The Oscar class, used in the example below, loads output files and applies particle filters.
Event Analysis
For more advanced analysis with individual Particle objects, SPARKX offers a range of classes for calculating bulk observables, centrality, event characteristics, and more:
BulkObservables: Calculates bulk observables like \(\frac{dN}{dy}\) and midrapidity yields.
CentralityClasses: Calculates centrality classes for a given set of events.
EventCharacteristics: Calculates event characteristics like eccentricities and energy densities, which can be used as input for hydrodynamical simulations.
JetAnalysis: Finds jets in the events using FastJet.
MultiParticlePtCorrelations: Calculates multi-particle transverse momentum correlations.
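To illustrate the kind of quantity BulkObservables provides, here is a from-scratch sketch (not the SPARKX API; `midrapidity_yield` is a hypothetical helper) of an average midrapidity yield dN/dy, assuming one list of particle rapidities per event:

```python
import numpy as np

def midrapidity_yield(rapidities_per_event, y_cut=0.5):
    """Average dN/dy at midrapidity: count particles with |y| < y_cut
    in each event, average over events, and divide by the window width 2*y_cut."""
    counts = [np.sum(np.abs(np.asarray(y)) < y_cut) for y in rapidities_per_event]
    return float(np.mean(counts) / (2.0 * y_cut))

# two toy events with particle rapidities
events = [[-0.2, 0.1, 1.4], [0.3, -0.9, 0.05, 2.1]]
print(midrapidity_yield(events))  # 4 particles inside |y| < 0.5 over 2 events -> 2.0
```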
Flow Calculations
SPARKX provides several classes for calculating anisotropic flow:
EventPlaneFlow: Calculates event plane flow.
LeeYangZeroFlow: Calculates flow using the Lee-Yang zero method.
PCAFlow: Calculates flow using principal component analysis.
QCumulantFlow: Calculates flow using Q-cumulants.
ReactionPlaneFlow: Calculates flow using the reaction plane method.
ScalarProductFlow: Calculates flow using the scalar product method.
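As a conceptual illustration of what these classes compute (this is not the SPARKX API), the reaction plane method estimates the elliptic flow coefficient as \(v_2 = \langle \cos 2(\phi - \Psi_{RP}) \rangle\), averaged over the azimuthal angles \(\phi\) of the particles:

```python
import numpy as np

def v2_reaction_plane(phis, psi_rp=0.0):
    """Elliptic flow v2 via the reaction plane method:
    the particle average of cos(2 * (phi - Psi_RP))."""
    phis = np.asarray(phis)
    return float(np.mean(np.cos(2.0 * (phis - psi_rp))))

# toy azimuthal angles clustered in-plane (phi near 0 and pi) -> v2 close to 1
phis = [0.1, -0.2, 3.0, 3.3, 0.05, 3.2]
print(v2_reaction_plane(phis))
```

For an isotropic (flow-free) distribution of angles the same average vanishes, which is a quick sanity check for any flow implementation.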
Additional Tools
Histogram: Creates histograms.
Jackknife: Calculates delete-d jackknife errors.
Lattice3D: Handles 3D lattices.
Particle: Handles individual particle (hadron or parton) objects.
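The delete-d jackknife that the Jackknife class refers to can be sketched from scratch (hypothetical helper, not the SPARKX API): the statistic is recomputed on every subsample with d observations removed, and the rescaled spread of those estimates gives the error:

```python
import numpy as np
from itertools import combinations

def delete_d_jackknife_error(data, statistic=np.mean, d=1):
    """Delete-d jackknife error: recompute `statistic` on each subsample
    with d points removed, then rescale the variance of those estimates
    by the standard prefactor (n - d) / (d * n_subsamples)."""
    data = np.asarray(data, dtype=float)
    n = len(data)
    estimates = np.asarray([statistic(np.delete(data, idx))
                            for idx in combinations(range(n), d)])
    factor = (n - d) / (d * len(estimates))
    return float(np.sqrt(factor * np.sum((estimates - estimates.mean()) ** 2)))

data = [1.0, 2.0, 3.0, 4.0, 5.0]
# for the mean with d=1, this reproduces the standard error of the mean
print(delete_d_jackknife_error(data))
```

The d=1 case applied to the sample mean recovers the familiar standard error \(\sqrt{s^2/n}\), which makes it easy to validate against a known result.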
For detailed information on these classes and their methods, see the classes documentation.
Installation
For information on how to install the package, please refer to the installation documentation.
Basic Usage Example
Once installed, here’s a simple example to load data and perform some basic filtering and analysis:
from sparkx import Oscar, Histogram
# Load data using the Oscar class and apply some filters, return Particle objects
all_filters = {'charged_particles': True, 'pT_cut': (0.15, 3.5), 'pseudorapidity_cut': 0.5}
data = Oscar("./particle_lists.oscar", filters=all_filters).particle_objects_list()
# data now contains a list of Particle objects for each event that passed the filters
# assume that we want to create a charged hadron transverse momentum histogram
# extract the transverse momentum of each particle for all events
pT = [particle.pT_abs() for event in data for particle in event]
# create a histogram of the transverse momentum with 10 bins between 0.15 and 3.5 GeV
hist = Histogram(bin_boundaries=(0.15, 3.5, 10))
hist.add_value(pT) # add the data to the histogram
hist.statistical_error() # calculate the statistical errors assuming Poisson statistics
hist.scale_histogram(1./len(data)) # normalize the histogram to the number of events
# define the names of the columns in the output file
column_labels = [{'bin_center': 'pT [GeV]',
'bin_low': 'pT_low [GeV]',
'bin_high': 'pT_high [GeV]',
'distribution': 'dN/dpT [1/GeV]',
'stat_err+': 'stat_err+',
'stat_err-': 'stat_err-',
'sys_err+': 'sys_err+',
'sys_err-': 'sys_err-'}]
# write the histogram to a file
hist.write_to_file('pT_histogram.dat', hist_labels=column_labels)
You can find more examples in the documentation for the individual classes.
Troubleshooting
If you encounter issues:
Ensure all dependencies are installed with the correct versions.
Check the GitHub repository to report bugs or view open issues.