debugtools

Utilities to help with debugging Starsim runs

Classes

Debugger: Step through one or more sims and pause or raise an exception when a condition is met
Profile: Class to profile the performance of a simulation

Debugger

debugtools.Debugger(
    self,
    *args,
    func,
    skip=None,
    verbose=True,
    die=True,
    run=False,
)

Step through one or more sims and pause or raise an exception when a condition is met

Parameters

args (list): the sim or sims to step through (default: ())
func (func/str): the function to run on the sims (or can use a built-in one, e.g. ‘equal’) (required)
skip (list): if provided, additional object names/types to skip (not check) (default: None)
verbose (bool): whether to print progress during the run (default: True)
die (bool): whether to raise an exception if the condition is met; alternatively ‘pause’ will pause execution, and False will just print (default: True)
run (bool): whether to run immediately (default: False)

Examples:

## Example 1: Identical sims are identical
import starsim as ss

s1 = ss.Sim(pars=dict(diseases='sis', networks='random'), n_agents=250)
s2 = s1.copy()
s3 = s1.copy()
db = ss.Debugger(s1, s2, s3, func='equal')
db.run()

## Example 2: Pause when sim results diverge
import sciris as sc
import starsim as ss

# Set up the sims
pars = dict(networks='random', n_agents=1000, verbose=0)
sis1 = ss.SIS(beta=0.050)
sis2 = ss.SIS(beta=0.051) # Very slightly more
s1 = ss.Sim(pars, diseases=sis1)
s2 = ss.Sim(pars, diseases=sis2)

# Run the debugger
db = ss.Debugger(s1, s2, func='equal_results', die='pause')
db.run()

# Show non-matching results
sc.heading('Differing results')
df = db.results[-1].df
df = df[~df['equal']]
df.disp()

Methods

equal: Run all other tests
equal_check: Check if equality is false, and print a message or die
equal_dists: Check if the dists are equal
equal_pars: Check if SimPars are equal
equal_people: Check if people are equal

equal
debugtools.Debugger.equal(*sims)

Run all other tests

equal_check
debugtools.Debugger.equal_check(e, which)

Check if equality is false, and print a message or die

equal_dists
debugtools.Debugger.equal_dists(*sims)

Check if the dists are equal

equal_pars
debugtools.Debugger.equal_pars(*sims, skip='label')

Check if SimPars are equal

equal_people
debugtools.Debugger.equal_people(*sims)

Check if people are equal

Profile

debugtools.Profile(
    self,
    sim,
    follow=None,
    do_run=True,
    plot=True,
    verbose=True,
    **kwargs,
)

Class to profile the performance of a simulation

Typically invoked via sim.profile().

Parameters

sim (ss.Sim): the sim to profile (required)
follow (func/list): a list of functions/methods to follow in detail (default: None)
do_run (bool): whether to immediately run the sim (default: True)
plot (bool): whether to plot time spent per module step (default: True)
**kwargs (dict): passed to sc.profile() (default: {})

Example:

import starsim as ss

net = ss.RandomNet()
sis = ss.SIS()
sim = ss.Sim(networks=net, diseases=sis)
prof = sim.profile(follow=[net.add_pairs, sis.infect])
prof.disp()

Methods

disp: Same as sc.profile.disp(), but skip the run function by default
plot_cpu: Shortcut to sim.loop.plot_cpu()
profile_init: Handle sim init – both run it and profile it
run: Profile the performance of the simulation

disp
debugtools.Profile.disp(bytime=1, maxentries=10, skiprun=True)

Same as sc.profile.disp(), but skip the run function by default
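
Example (a sketch reusing the prof object from the Profile example above, with only the parameters shown in the signature):

prof.disp(maxentries=20, skiprun=False)  # Show more entries and include the run function itself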

plot_cpu
debugtools.Profile.plot_cpu()

Shortcut to sim.loop.plot_cpu()

profile_init
debugtools.Profile.profile_init()

Handle sim init – both run it and profile it

run
debugtools.Profile.run()

Profile the performance of the simulation

Functions

check_requires: Check that the module’s requirements (of other modules) are met
check_version: Check the expected Starsim version against the one actually installed
metadata: Store metadata; like sc.metadata(), but optimized for speed
mock_module: Create a minimal mock “Module” object
mock_people: Create a minimal mock “People” object
mock_sim: Create a minimal mock “Sim” object to initialize objects that require it
mock_time: Create a minimal mock “Time” object

check_requires

debugtools.check_requires(sim, requires, *args)

Check that the module’s requirements (of other modules) are met
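
Example (a hedged sketch: whether check_requires is exposed as ss.check_requires, and whether requires accepts a module class rather than a name string, are assumptions here):

import starsim as ss

sim = ss.Sim(diseases='sis', networks='random')
sim.init()
ss.check_requires(sim, ss.SIS)  # Assumed to raise or warn if the requirement is not met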

check_version

debugtools.check_version(expected, die=False, warn=True)

Check the expected Starsim version against the one actually installed. The expected version string may optionally start with ‘>=’ or ‘<=’ (== is implied otherwise), but other operators (e.g. ~=) are not supported. Note that ‘>’ and ‘<’ are interpreted to mean ‘>=’ and ‘<=’; strict (exclusive) comparisons are not supported.

Parameters

expected (str): expected version information (required)
die (bool): whether or not to raise an exception if the check fails (default: False)
warn (bool): whether to raise a warning if the check fails (default: True)

Example:

import starsim as ss
ss.check_version('>=3.0.0', die=True) # Will raise an exception if an older version is used

metadata

debugtools.metadata(comments=None)

Store metadata; like sc.metadata(), but optimized for speed
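
Example (a sketch; assuming the function is exposed as ss.metadata, like ss.check_version above):

import starsim as ss

md = ss.metadata(comments='baseline run')  # Record run metadata with an optional comment
print(md)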

mock_module

debugtools.mock_module(dur=10, **kwargs)

Create a minimal mock “Module” object

mock_people

debugtools.mock_people(n_agents=100)

Create a minimal mock “People” object

mock_sim

debugtools.mock_sim(n_agents=100, **kwargs)

Create a minimal mock “Sim” object to initialize objects that require it

Parameters

n_agents (int): the number of agents to create (default: 100)
**kwargs (dict): passed to ss.mock_time() (default: {})
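
Example (a minimal sketch; assuming the helper is exposed as ss.mock_sim, like ss.mock_time referenced above):

import starsim as ss

sim = ss.mock_sim(n_agents=50, dt=0.5, dur=20, start=2010)  # Extra kwargs are passed to ss.mock_time()
# The mock sim can then be handed to objects that need a sim to initialize,
# without constructing and initializing a full ss.Sim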

mock_time

debugtools.mock_time(dt=1.0, dur=10, start=2000)

Create a minimal mock “Time” object