Tutorials

The tutorials below walk through common SkyLLH analysis tasks using IceCube public data. They progress from a basic steady-state point-source fit to more specialised techniques.

The tutorials support both the 10-year (IceTracks-DR1) and the 14-year (IceTracks-DR2) (TODO: add link) IceCube public point-source datasets. The datasets are downloaded automatically from dataverse.harvard.edu to a local cache directory (~/.skyllh/cache). To use a custom dataset location, set cfg['repository']['base_path'] to the desired path.
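
For example, overriding the cache location could look like this (the path below is hypothetical; only the cfg['repository']['base_path'] key is taken from the description above):

```python
from skyllh.core.config import Config

cfg = Config()
# Use a pre-downloaded copy of the public data instead of letting
# SkyLLH download it into the default cache (~/.skyllh/cache).
cfg['repository']['base_path'] = '/data/icecube/public'
```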

To load the respective datasets:

import skyllh
from skyllh.core.config import Config

cfg = Config()

datasets = skyllh.create_datasets('IceTracks-DR2', cfg=cfg)

or

datasets = skyllh.create_datasets('IceTracks-DR1', cfg=cfg)

Additional information about the IceCube public datasets can be found on the dataset release pages at dataverse.harvard.edu.

The list of tutorials below is not exhaustive; it illustrates how to perform point-source analyses with SkyLLH. We encourage users to explore the documentation and contribute additional tutorials covering other analysis types and techniques.

Fitting a steady point source with the public 14-year IceCube track data

Fit a steady point source (NGC 1068) using both the IceCube 10-year and 14-year public track data. Covers loading datasets, maximising the log-likelihood ratio, computing the test statistic, and deriving flux normalisations.

Dataset collections

Introduces the concept of dataset collections, which are used to manage multiple datasets in a unified way. Covers loading datasets, inspecting available datasets, and accessing individual datasets.

Fixing the Spectral Index

Repeat the point-source fit with a fixed (non-free) spectral index. Also demonstrates converting between mean signal event counts and flux normalisations.
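
Because the expected number of signal events scales linearly with the flux normalisation, the conversion is a simple ratio. A toy sketch of this relation (the conversion factor below is made up; SkyLLH derives the true factor from the detector response):

```python
# Hypothetical conversion factor: mean signal events per unit flux
# normalisation, at a fixed spectral index and livetime.
ns_per_unit_flux = 12.5

def flux_norm_from_ns(ns):
    """Convert a mean signal event count to a flux normalisation."""
    return ns / ns_per_unit_flux

def ns_from_flux_norm(phi0):
    """Convert a flux normalisation to a mean signal event count."""
    return phi0 * ns_per_unit_flux

print(flux_norm_from_ns(25.0))  # 2.0
```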

Setting a user-defined energy range for the signal injection

Set a user-defined energy range for the signal injection. Illustrates the distinction between the signal-injection energy range and the likelihood hypothesis energy range.

Producing a Sky Scan with IceCube Public 14-year data

Produce a 2-D test-statistic map by scanning source positions on a sky grid around a candidate source. Visualises best-fit position and confidence contours using Wilks’ theorem.
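
Under Wilks' theorem, the test-statistic difference from the best fit is asymptotically chi-squared distributed, so for a two-parameter position scan the contour levels follow from the chi-squared quantile with two degrees of freedom:

```python
from scipy.stats import chi2

# Delta-TS thresholds for confidence contours on a 2-D sky-scan map,
# assuming Wilks' theorem with two fitted position parameters.
levels = {cl: chi2.ppf(cl, df=2) for cl in (0.68, 0.90, 0.95)}
for cl, delta_ts in levels.items():
    print(f"{cl:.0%} contour: Delta TS = {delta_ts:.2f}")
# 68% -> 2.28, 90% -> 4.61, 95% -> 5.99
```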

Producing a likelihood scan using IceCube 14-year public data

Scan the log-likelihood ratio over a 2-D grid of flux normalisation and spectral index. Produces confidence contours in the gamma-flux parameter space.
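
A minimal sketch of such a scan, using a made-up quadratic log-likelihood surface in place of the real SkyLLH likelihood (the best-fit point and widths are hypothetical):

```python
import numpy as np
from scipy.stats import chi2

# Grid of flux normalisation (phi0) and spectral index (gamma).
phi0 = np.linspace(0.5, 2.0, 61)
gamma_idx = np.linspace(1.5, 3.5, 61)
P, G = np.meshgrid(phi0, gamma_idx)

# Made-up quadratic 2*Delta(log-likelihood) surface with a
# hypothetical best fit at phi0 = 1.2, gamma = 2.5.
two_delta_llh = ((P - 1.2) / 0.2) ** 2 + ((G - 2.5) / 0.3) ** 2

# Grid points inside the 90% confidence contour (Wilks, 2 d.o.f.).
inside = two_delta_llh <= chi2.ppf(0.90, df=2)
print(inside.sum(), "of", inside.size, "grid points inside")
```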

Illustrating different approaches to computing the significance (local p-value) using SkyLLH

Compare two methods for computing a local p-value from a test statistic: generating background-only trials versus fitting a truncated gamma distribution to the TS distribution.
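
The two approaches can be contrasted on a synthetic background-only TS distribution (all numbers below are made up for illustration; the real distributions come from SkyLLH trial generation):

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(42)

# Synthetic background-only TS distribution: roughly half the trials
# are under-fluctuations clipped at TS = 0, the rest follow a gamma shape.
n_trials = 100_000
ts_bkg = np.where(rng.random(n_trials) < 0.5, 0.0,
                  rng.gamma(shape=1.2, scale=1.0, size=n_trials))

ts_obs = 6.0  # hypothetical observed test statistic

# Method 1: empirical local p-value directly from the trials.
p_trials = np.mean(ts_bkg >= ts_obs)

# Method 2: fit a gamma distribution to the TS > 0 part of the
# distribution (truncated at zero) and extrapolate its tail.
eta = np.mean(ts_bkg > 0)  # fraction of trials with TS > 0
shape, loc, scale = gamma.fit(ts_bkg[ts_bkg > 0], floc=0)
p_fit = eta * gamma.sf(ts_obs, shape, loc=loc, scale=scale)

print(p_trials, p_fit)
```

The fit-based method is most useful when the observed TS lies far in the tail, where generating enough trials for a direct estimate becomes expensive.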

Sensitivity and Discovery Potential

Estimate sensitivity and discovery potential for a point source. Demonstrates the analysis construction and how to convert signal counts to flux.
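
The sensitivity is commonly defined as the signal strength at which 90% of signal trials exceed the median of the background-only TS distribution. A toy sketch with a made-up TS model (not the SkyLLH likelihood):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 10_000

# Toy background-only TS distribution (chi-squared, 1 d.o.f.)
# and its median, which sets the sensitivity threshold.
ts_bkg = rng.chisquare(df=1, size=n_trials)
ts_median = np.median(ts_bkg)

def passing_fraction(ns):
    """Fraction of signal trials above the background median, using a
    made-up TS model in which TS grows with the injected signal ns."""
    ts_sig = rng.noncentral_chisquare(df=1, nonc=ns, size=n_trials)
    return np.mean(ts_sig > ts_median)

# Scan injected signal strengths until 90% of signal trials pass.
sensitivity_ns = None
for ns in np.arange(0.5, 10.0, 0.5):
    if passing_fraction(ns) >= 0.9:
        sensitivity_ns = ns
        break
print("sensitivity at ns ~", sensitivity_ns)
```

The discovery potential follows the same recipe with a 5-sigma background threshold in place of the median and a 50% passing fraction; the resulting signal count is then converted to a flux as in the fixed-spectral-index tutorial.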

Smearing Matrix and Effective Area

Access and visualise IceCube instrument response functions: the 5-D smearing matrix and the effective area as a function of energy and declination.
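
The effective area converts a flux into an expected event rate via rate = integral of A_eff(E) * Phi(E) over energy. A toy numerical integration (the effective-area and flux values are made up; the tutorial reads the real A_eff from the public-data release files):

```python
import numpy as np

# Toy effective area and power-law flux.
energies = np.logspace(3, 6, 200)             # GeV
a_eff = 1e-2 * (energies / 1e3) ** 0.8        # m^2, rising with energy
flux = 1e-8 * (energies / 1e3) ** -2.0        # GeV^-1 m^-2 s^-1

# Expected event rate: trapezoidal integral of A_eff(E) * flux(E).
integrand = a_eff * flux
rate = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(energies))
print(rate, "events per second")
```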

Time-dependent analyses with the public 10-year IceCube point-source data

Perform a time-dependent point-source analysis. Uses the Expectation-Maximisation (EM) algorithm to fit the timing and duration of a neutrino flare (TXS 0506+056).
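
The EM step can be illustrated on synthetic event times, fitting a uniform-background-plus-Gaussian-flare mixture (all numbers below are made up; the tutorial applies the same idea to the TXS 0506+056 data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic event times: uniform background over a 1000-day window
# plus a Gaussian flare centred at t0 = 400 with width sigma = 20.
t_bkg = rng.uniform(0.0, 1000.0, size=900)
t_sig = rng.normal(400.0, 20.0, size=100)
times = np.concatenate([t_bkg, t_sig])

# EM fit of the flare time, width, and signal fraction, starting from
# a deliberately broad initial guess.
mu, sigma, frac = 420.0, 50.0, 0.2
for _ in range(200):
    # E-step: posterior probability that each event belongs to the flare.
    g = frac * np.exp(-0.5 * ((times - mu) / sigma) ** 2) \
        / (sigma * np.sqrt(2.0 * np.pi))
    b = (1.0 - frac) / 1000.0
    w = g / (g + b)
    # M-step: update the flare parameters from the weighted events.
    mu = np.average(times, weights=w)
    sigma = np.sqrt(np.average((times - mu) ** 2, weights=w))
    frac = np.mean(w)

print(f"flare time {mu:.1f}, width {sigma:.1f}, fraction {frac:.2f}")
```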