.. _quickstart:

Quickstart
==========

This guide will get you started with SPARKX quickly.
SPARKX is designed to help users of `SMASH `_
and `JETSCAPE/X-SCAPE `_ codes analyze output simulation data with ease.

Key Components of SPARKX
------------------------

The following classes are useful for loading, filtering, and analyzing your SMASH or JETSCAPE/X-SCAPE data:

Data Loading and Filtering
~~~~~~~~~~~~~~~~~~~~~~~~~~

* `Jetscape `_: Reads hadron or parton output from JETSCAPE/X-SCAPE and allows particle filtering.
* `Oscar `_: Reads various Oscar data formats and provides particle filtering methods.
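To illustrate what such particle filters select, here is a small sketch in plain Python. The tuples and the ``passes_filters`` helper are hypothetical stand-ins for illustration only, not the SPARKX API; the cut values mirror the filter dictionary used in the usage example below.

```python
# Illustrative sketch of a charged-particle, pT-window, and pseudorapidity cut,
# using plain (charge, pT, eta) tuples as stand-in "particles".

def passes_filters(charge, pT, eta):
    """Mimic charged_particles, a pT cut of (0.15, 3.5), and |eta| <= 0.5."""
    return charge != 0 and 0.15 <= pT <= 3.5 and abs(eta) <= 0.5

event = [(1, 0.5, 0.1),   # passes all three cuts
         (0, 1.0, 0.2),   # rejected: neutral
         (-1, 4.0, 0.3),  # rejected: pT too large
         (1, 1.2, 0.9)]   # rejected: |eta| too large
selected = [p for p in event if passes_filters(*p)]
print(selected)  # [(1, 0.5, 0.1)]
```

In SPARKX itself, these cuts are passed as a dictionary to the reader classes instead of being applied by hand.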

Event Analysis
~~~~~~~~~~~~~~

For more advanced analysis with individual particle objects,
SPARKX offers a range of classes for calculating centrality, flow, and more:

* `BulkObservables `_: Calculates bulk observables like :math:`\frac{dN}{dy}` and midrapidity yields.
* `CentralityClasses `_: Calculates centrality classes for a given set of events.
* `EventCharacteristics `_: Calculates event characteristics like eccentricities and energy densities, which can be used as input for hydrodynamic simulations.
* `JetAnalysis `_: Finds jets in the events using `FastJet `_.
* `MultiParticlePtCorrelations `_: Calculates multi-particle transverse momentum correlations.
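As a conceptual aside, a midrapidity yield of the kind computed by ``BulkObservables`` amounts to counting particles in a rapidity window and averaging over events. The following is a minimal sketch with plain floats standing in for particle rapidities, not the SPARKX implementation:

```python
import numpy as np

# Toy sketch of a dN/dy-style midrapidity yield: count particles with |y| < 0.5
# per event, average over events, and divide by the window width.
events = [
    [0.1, -0.3, 0.8, 1.2],  # rapidities of particles in event 1
    [-0.2, 0.4, -1.5],      # rapidities of particles in event 2
]
window = 1.0  # width of the rapidity window |y| < 0.5
yields = [sum(abs(y) < 0.5 for y in ev) for ev in events]
dN_dy = np.mean(yields) / window
print(dN_dy)  # 2.0 for this toy data
```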

Flow Calculations
~~~~~~~~~~~~~~~~~

SPARKX provides several classes for calculating anisotropic flow:

* `EventPlaneFlow `_: Calculates flow using the event plane method.
* `LeeYangZeroFlow `_: Calculates flow using the Lee-Yang zero method.
* `PCAFlow `_: Calculates flow using principal component analysis.
* `QCumulantFlow `_: Calculates flow using Q-cumulants.
* `ReactionPlaneFlow `_: Calculates flow using the reaction plane method.
* `ScalarProductFlow `_: Calculates flow using the scalar product method.
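The common idea behind these methods is the textbook definition :math:`v_n = \langle \cos\left(n (\phi - \Psi)\right) \rangle`. The sketch below demonstrates this for the reaction plane method (with a known :math:`\Psi_{RP}`) on azimuthal angles sampled from a distribution with a built-in :math:`v_2`; it is a self-contained illustration of the formula, not the SPARKX implementation:

```python
import numpy as np

# Reaction-plane method sketch: v2 = <cos(2 * (phi - Psi_RP))>.
# Sample angles from dN/dphi proportional to 1 + 2 v2 cos(2 phi) by rejection
# sampling, then recover v2 from the average.
rng = np.random.default_rng(42)
psi_rp = 0.0   # known reaction plane angle in this toy setup
v2_true = 0.1  # elliptic flow coefficient built into the distribution

phi = []
while len(phi) < 20000:
    x = rng.uniform(0.0, 2.0 * np.pi)
    if rng.uniform(0.0, 1.0 + 2.0 * v2_true) < 1.0 + 2.0 * v2_true * np.cos(2.0 * x):
        phi.append(x)
phi = np.array(phi)

v2_est = np.mean(np.cos(2.0 * (phi - psi_rp)))
print(v2_est)  # close to the input v2 of 0.1
```

In real data the event plane (or reaction plane) must itself be estimated from the particles, which is what the classes above handle, including the corresponding resolution corrections.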

Additional Tools
~~~~~~~~~~~~~~~~

* `Histogram `_: Creates histograms.
* `Jackknife `_: Calculates delete-d jackknife errors.
* `Lattice3D `_: Handles 3D lattices.
* `Particle `_: Handles individual particle (hadron or parton) objects.

For detailed information on these classes and their methods, see the `classes documentation `_.
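For readers unfamiliar with jackknife errors, here is a minimal sketch of the delete-1 case (:math:`d = 1`) for the sample mean, written in plain NumPy rather than with the SPARKX ``Jackknife`` class:

```python
import numpy as np

# Delete-1 jackknife error for the sample mean: form leave-one-out estimates,
# then combine their spread with the (n - 1)/n prefactor.
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
n = len(data)

# leave-one-out estimates of the mean
loo = np.array([(data.sum() - x) / (n - 1) for x in data])
jack_err = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))

# for the mean, this reproduces the usual standard error of the mean
sem = data.std(ddof=1) / np.sqrt(n)
print(jack_err, sem)  # the two values agree
```

The jackknife becomes genuinely useful for estimators where no simple closed-form error exists; the delete-d generalization removes d samples at a time.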

Installation
------------

For information on how to install the package, please refer to the `installation documentation `_.

Basic Usage Example
-------------------

Once installed, here is a simple example that loads data and performs some basic filtering and analysis:

.. code-block:: python

    from sparkx import Oscar
    from sparkx import Histogram

    # Load data using the Oscar class, apply some filters, and return Particle objects
    all_filters = {'charged_particles': True,
                   'pT_cut': (0.15, 3.5),
                   'pseudorapidity_cut': 0.5}
    data = Oscar("./particle_lists.oscar", filters=all_filters).particle_objects_list()

    # data now contains a list of Particle objects for each event that passed the filters.
    # Assume that we want to create a charged hadron transverse momentum histogram:
    # extract the transverse momentum of each particle for all events
    pT = [particle.pT_abs() for event in data for particle in event]

    # create a histogram of the transverse momentum with 10 bins between 0.15 and 3.5 GeV
    hist = Histogram(bin_boundaries=(0.15, 3.5, 10))
    hist.add_value(pT)  # add the data to the histogram
    hist.statistical_error()  # calculate the statistical errors assuming Poisson statistics
    # normalize the histogram to the number of events and divide by the bin width
    hist.scale_histogram(1. / (len(data) * hist.bin_width()))

    # define the names of the columns in the output file
    column_labels = [{'bin_center': 'pT [GeV]',
                      'bin_low': 'pT_low [GeV]',
                      'bin_high': 'pT_high [GeV]',
                      'distribution': '1/N_ev * dN/dpT [1/GeV]',
                      'stat_err+': 'stat_err+',
                      'stat_err-': 'stat_err-',
                      'sys_err+': 'sys_err+',
                      'sys_err-': 'sys_err-'}]

    # write the histogram to a file
    hist.write_to_file('pT_histogram.dat', hist_labels=column_labels)

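To make the histogram steps above concrete, the following self-contained NumPy sketch reproduces the same arithmetic (bin counts, Poisson :math:`\sqrt{N}` errors, then per-event and per-bin-width normalization) on toy data. It illustrates the computation only and does not use the SPARKX classes:

```python
import numpy as np

# Plain-NumPy sketch of the Histogram steps: bin the values, attach Poisson
# (sqrt of counts) errors, then normalize per event and per bin width.
pT = [0.2, 0.3, 0.3, 1.0, 2.5, 3.0]  # toy transverse momenta from all events
n_events = 2

counts, edges = np.histogram(pT, bins=10, range=(0.15, 3.5))
errors = np.sqrt(counts)             # Poisson statistics
bin_width = edges[1] - edges[0]

dN_dpT = counts / (n_events * bin_width)      # 1/N_ev * dN/dpT
err_dN_dpT = errors / (n_events * bin_width)  # propagated the same way

# integrating the normalized spectrum recovers the yield per event
print(dN_dpT.sum() * bin_width)  # 3.0 (6 particles over 2 events)
```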
You can find more examples in the documentation for the individual classes.

Troubleshooting
---------------

If you encounter issues:

* Ensure all dependencies are installed with the correct versions.
* Check the `GitHub repository `_ to report bugs or view open issues.