LightningIRCLI

class lightning_ir.main.LightningIRCLI(model_class: type[LightningModule] | Callable[..., LightningModule] | None = None, datamodule_class: type[LightningDataModule] | Callable[..., LightningDataModule] | None = None, save_config_callback: type[SaveConfigCallback] | None = SaveConfigCallback, save_config_kwargs: dict[str, Any] | None = None, trainer_class: type[Trainer] | Callable[..., Trainer] = Trainer, trainer_defaults: dict[str, Any] | None = None, seed_everything_default: bool | int = True, parser_kwargs: dict[str, Any] | dict[str, dict[str, Any]] | None = None, parser_class: type[LightningArgumentParser] = LightningArgumentParser, subclass_mode_model: bool = False, subclass_mode_data: bool = False, args: list[str] | dict[str, Any] | Namespace | None = None, run: bool = True, auto_configure_optimizers: bool = True, load_from_checkpoint_support: bool = True)[source]

Bases: LightningCLI

__init__(model_class: type[LightningModule] | Callable[..., LightningModule] | None = None, datamodule_class: type[LightningDataModule] | Callable[..., LightningDataModule] | None = None, save_config_callback: type[SaveConfigCallback] | None = SaveConfigCallback, save_config_kwargs: dict[str, Any] | None = None, trainer_class: type[Trainer] | Callable[..., Trainer] = Trainer, trainer_defaults: dict[str, Any] | None = None, seed_everything_default: bool | int = True, parser_kwargs: dict[str, Any] | dict[str, dict[str, Any]] | None = None, parser_class: type[LightningArgumentParser] = LightningArgumentParser, subclass_mode_model: bool = False, subclass_mode_data: bool = False, args: list[str] | dict[str, Any] | Namespace | None = None, run: bool = True, auto_configure_optimizers: bool = True, load_from_checkpoint_support: bool = True) → None

Receives as input PyTorch Lightning classes (or callables that return such classes), which are instantiated and called using a parsed configuration file and/or command line arguments.

Parsing of configuration from environment variables can be enabled by setting parser_kwargs={"default_env": True}. A full configuration YAML is parsed from PL_CONFIG if set, and individual settings are parsed from variables named like PL_TRAINER__MAX_EPOCHS.

For more info, read the CLI docs.
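As a hedged illustration, a minimal entry-point script might look as follows; enabling default_env is optional, and the script name used below is a placeholder:

    from lightning_ir.main import LightningIRCLI

    if __name__ == "__main__":
        # Parses the config file, command line args, and (with default_env
        # enabled) PL_* environment variables, then runs the requested
        # subcommand.
        LightningIRCLI(parser_kwargs={"default_env": True})

With such a script, an individual setting can then be supplied from the environment, e.g. PL_TRAINER__MAX_EPOCHS=10 python main.py fit --config config.yaml.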

Parameters:
  • model_class – An optional LightningModule class to train on or a callable which returns a LightningModule instance when called. If None, you can pass a registered model with --model=MyModel.

  • datamodule_class – An optional LightningDataModule class or a callable which returns a LightningDataModule instance when called. If None, you can pass a registered datamodule with --data=MyDataModule.

  • save_config_callback – A callback class to save the config.

  • save_config_kwargs – Parameters that will be used to instantiate the save_config_callback.

  • trainer_class – An optional subclass of the Trainer class or a callable which returns a Trainer instance when called.

  • trainer_defaults – Set to override Trainer defaults or add persistent callbacks. The callbacks added through this argument will not be configurable from a configuration file and will always be present for this particular CLI. Alternatively, configurable callbacks can be added as explained in the CLI docs.

  • seed_everything_default – Seed value passed to seed_everything(). Set to True to choose a seed automatically; set to False to skip calling seed_everything() altogether.

  • parser_kwargs – Additional arguments to instantiate each LightningArgumentParser.

  • subclass_mode_model – Whether model can be any subclass of the given class.

  • subclass_mode_data – Whether datamodule can be any subclass of the given class.

  • args – Arguments to parse. If None the arguments are taken from sys.argv. Command line style arguments can be given in a list. Alternatively, structured config options can be given in a dict or jsonargparse.Namespace.

  • run – Whether subcommands should be added to run a Trainer method. If set to False, only the trainer and model classes are instantiated and no subcommand is run (see the sketch after this list).

  • auto_configure_optimizers – Whether to automatically add default optimizer and lr_scheduler arguments.

  • load_from_checkpoint_support – Whether save_hyperparameters should save the original parsed hyperparameters (instead of what __init__ receives), such that it is possible for load_from_checkpoint to correctly instantiate classes even when using complex nesting and dependency injection.
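As referenced above, a hedged sketch of using args and run to instantiate the classes programmatically without running a subcommand; the model class path and its init args are illustrative assumptions, not required values:

    from lightning_ir.main import LightningIRCLI

    # Structured config options passed as a dict instead of sys.argv.
    cli = LightningIRCLI(
        args={
            "model": {
                "class_path": "lightning_ir.BiEncoderModule",  # illustrative
                "init_args": {"model_name_or_path": "bert-base-uncased"},
            },
            "trainer": {"max_epochs": 3},
        },
        run=False,  # instantiate only; do not run a subcommand
    )
    model = cli.model      # the instantiated LightningModule
    trainer = cli.trainer  # the instantiated Trainer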

Methods

  • add_arguments_to_parser(parser) – Implement to add extra arguments to the parser or link arguments.

  • configure_optimizers(lightning_module, optimizer) – Override to customize the configure_optimizers() method.

  • subcommands() – Defines the list of available subcommands and the arguments to skip.

add_arguments_to_parser(parser)[source]

Implement to add extra arguments to the parser or link arguments.

Parameters:

parser – The parser object to which arguments can be added.
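A hedged sketch of overriding this hook in a subclass; the extra option and the linked keys are hypothetical examples, not options defined by the library:

    from lightning.pytorch.cli import LightningArgumentParser

    from lightning_ir.main import LightningIRCLI

    class MyCLI(LightningIRCLI):
        def add_arguments_to_parser(self, parser: LightningArgumentParser) -> None:
            # Keep the arguments and links the parent class registers.
            super().add_arguments_to_parser(parser)
            # Hypothetical extra command line option.
            parser.add_argument("--notes", type=str, default="", help="Free-form run notes.")
            # Hypothetical link: derive a datamodule argument from a trainer argument.
            # parser.link_arguments("trainer.max_epochs", "data.init_args.num_epochs")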

static configure_optimizers(lightning_module: LightningModule, optimizer: Optimizer, lr_scheduler: WarmupLRScheduler | None = None) → Any[source]

Override to customize the configure_optimizers() method.

Parameters:
  • lightning_module – A reference to the model.

  • optimizer – The optimizer.

  • lr_scheduler – The learning rate scheduler (if used).
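A hedged sketch of an override; the return shapes follow Lightning's standard configure_optimizers() conventions, and the per-step scheduler interval is an assumption suited to warmup schedules rather than the library's verified behavior:

    from typing import Any

    from lightning.pytorch import LightningModule
    from torch.optim import Optimizer

    from lightning_ir.main import LightningIRCLI

    class MyCLI(LightningIRCLI):
        @staticmethod
        def configure_optimizers(
            lightning_module: LightningModule,
            optimizer: Optimizer,
            lr_scheduler=None,  # e.g. a WarmupLRScheduler instance
        ) -> Any:
            # Without a scheduler, the optimizer alone is a valid return value.
            if lr_scheduler is None:
                return optimizer
            # With a scheduler, return Lightning's dictionary format; stepping
            # per batch ("interval": "step") is typical for warmup schedules.
            return {
                "optimizer": optimizer,
                "lr_scheduler": {"scheduler": lr_scheduler, "interval": "step"},
            }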

static subcommands() → Dict[str, Set[str]][source]

Defines the list of available subcommands and the arguments to skip.
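A hedged sketch of what such a mapping can look like, limited here to standard Trainer methods; Lightning IR's actual override may differ (its command line also exposes retrieval subcommands such as index, search, and re_rank):

    from typing import Dict, Set

    from lightning_ir.main import LightningIRCLI

    class MyCLI(LightningIRCLI):
        @staticmethod
        def subcommands() -> Dict[str, Set[str]]:
            # Each key becomes a subcommand that dispatches to the Trainer
            # method of the same name; the value is the set of that method's
            # arguments to skip in the parser because the CLI supplies them.
            return {
                "fit": {"model", "train_dataloaders", "val_dataloaders", "datamodule"},
                "validate": {"model", "dataloaders", "datamodule"},
                "test": {"model", "dataloaders", "datamodule"},
            }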