LightningIRCLI
- class lightning_ir.main.LightningIRCLI(model_class: type[lightning.pytorch.core.module.LightningModule] | typing.Callable[[...], lightning.pytorch.core.module.LightningModule] | None = None, datamodule_class: type[lightning.pytorch.core.datamodule.LightningDataModule] | typing.Callable[[...], lightning.pytorch.core.datamodule.LightningDataModule] | None = None, save_config_callback: type[lightning.pytorch.cli.SaveConfigCallback] | None = <class 'lightning.pytorch.cli.SaveConfigCallback'>, save_config_kwargs: dict[str, typing.Any] | None = None, trainer_class: type[lightning.pytorch.trainer.trainer.Trainer] | typing.Callable[[...], lightning.pytorch.trainer.trainer.Trainer] = <class 'lightning.pytorch.trainer.trainer.Trainer'>, trainer_defaults: dict[str, typing.Any] | None = None, seed_everything_default: bool | int = True, parser_kwargs: dict[str, typing.Any] | dict[str, dict[str, typing.Any]] | None = None, parser_class: type[lightning.pytorch.cli.LightningArgumentParser] = <class 'lightning.pytorch.cli.LightningArgumentParser'>, subclass_mode_model: bool = False, subclass_mode_data: bool = False, args: list[str] | dict[str, typing.Any] | jsonargparse._namespace.Namespace | None = None, run: bool = True, auto_configure_optimizers: bool = True, load_from_checkpoint_support: bool = True)[source]
Bases: LightningCLI
- __init__(model_class: type[lightning.pytorch.core.module.LightningModule] | typing.Callable[[...], lightning.pytorch.core.module.LightningModule] | None = None, datamodule_class: type[lightning.pytorch.core.datamodule.LightningDataModule] | typing.Callable[[...], lightning.pytorch.core.datamodule.LightningDataModule] | None = None, save_config_callback: type[lightning.pytorch.cli.SaveConfigCallback] | None = <class 'lightning.pytorch.cli.SaveConfigCallback'>, save_config_kwargs: dict[str, typing.Any] | None = None, trainer_class: type[lightning.pytorch.trainer.trainer.Trainer] | typing.Callable[[...], lightning.pytorch.trainer.trainer.Trainer] = <class 'lightning.pytorch.trainer.trainer.Trainer'>, trainer_defaults: dict[str, typing.Any] | None = None, seed_everything_default: bool | int = True, parser_kwargs: dict[str, typing.Any] | dict[str, dict[str, typing.Any]] | None = None, parser_class: type[lightning.pytorch.cli.LightningArgumentParser] = <class 'lightning.pytorch.cli.LightningArgumentParser'>, subclass_mode_model: bool = False, subclass_mode_data: bool = False, args: list[str] | dict[str, typing.Any] | jsonargparse._namespace.Namespace | None = None, run: bool = True, auto_configure_optimizers: bool = True, load_from_checkpoint_support: bool = True) → None
Receives as input pytorch-lightning classes (or callables which return pytorch-lightning classes), which are called / instantiated using a parsed configuration file and / or command line args.
Parsing of configuration from environment variables can be enabled by setting parser_kwargs={"default_env": True}. A full configuration YAML would be parsed from PL_CONFIG if set. Individual settings are likewise parsed from variables named, for example, PL_TRAINER__MAX_EPOCHS. For more info, read the CLI docs.
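As a minimal sketch (the entry-point script and environment-variable values are illustrative, not defaults of this library), environment-variable parsing can be switched on like this:

    # Sketch: enable environment-variable configuration parsing.
    from lightning_ir.main import LightningIRCLI

    # With default_env=True the parser also reads a full YAML config from
    # PL_CONFIG and individual settings from variables such as
    # PL_TRAINER__MAX_EPOCHS (illustrative, see the CLI docs).
    cli = LightningIRCLI(parser_kwargs={"default_env": True})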
- Parameters:
model_class – An optional LightningModule class to train on or a callable which returns a LightningModule instance when called. If None, you can pass a registered model with --model=MyModel.
datamodule_class – An optional LightningDataModule class or a callable which returns a LightningDataModule instance when called. If None, you can pass a registered datamodule with --data=MyDataModule.
save_config_callback – A callback class to save the config.
save_config_kwargs – Parameters that will be used to instantiate the save_config_callback.
trainer_class – An optional subclass of the Trainer class or a callable which returns a Trainer instance when called.
trainer_defaults – Set to override Trainer defaults or add persistent callbacks. The callbacks added through this argument will not be configurable from a configuration file and will always be present for this particular CLI. Alternatively, configurable callbacks can be added as explained in the CLI docs.
seed_everything_default – Number for the seed_everything() seed value. Set to True to automatically choose a seed value. Setting it to False will avoid calling seed_everything.
parser_kwargs – Additional arguments to instantiate each LightningArgumentParser.
subclass_mode_model – Whether model can be any subclass of the given class.
subclass_mode_data – Whether datamodule can be any subclass of the given class.
args – Arguments to parse. If None, the arguments are taken from sys.argv. Command-line-style arguments can be given in a list. Alternatively, structured config options can be given in a dict or jsonargparse.Namespace.
run – Whether subcommands should be added to run a Trainer method. If set to False, the trainer and model classes will be instantiated only.
auto_configure_optimizers – Whether to automatically add default optimizer and lr_scheduler arguments.
load_from_checkpoint_support – Whether save_hyperparameters should save the original parsed hyperparameters (instead of what __init__ receives), such that it is possible for load_from_checkpoint to correctly instantiate classes even when using complex nesting and dependency injection.
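To illustrate how these parameters fit together, here is a hedged sketch of an entry point (the trainer defaults and seed shown are arbitrary examples, not values prescribed by this page):

    # Sketch of an entry point built on LightningIRCLI; values are examples.
    from lightning_ir.main import LightningIRCLI

    def main() -> None:
        # run=True (the default) adds subcommands that run Trainer methods,
        # so model/data/trainer are configured via CLI args or config files.
        LightningIRCLI(
            trainer_defaults={"max_epochs": 1},  # persistent Trainer defaults
            seed_everything_default=42,          # seed passed to seed_everything()
        )

    if __name__ == "__main__":
        main()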
Methods
add_arguments_to_parser(parser) – Implement to add extra arguments to the parser or link arguments.
configure_optimizers(lightning_module, optimizer) – Override to customize the configure_optimizers() method.
subcommands() – Defines the list of available subcommands and the arguments to skip.
- add_arguments_to_parser(parser)[source]
Implement to add extra arguments to the parser or link arguments.
- Parameters:
parser – The parser object to which arguments can be added.
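A sketch of the usual override pattern (the subclass name and the added/linked config keys below are hypothetical, not part of this page):

    from lightning.pytorch.cli import LightningArgumentParser
    from lightning_ir.main import LightningIRCLI

    class MyCLI(LightningIRCLI):  # hypothetical subclass
        def add_arguments_to_parser(self, parser: LightningArgumentParser) -> None:
            # Keep whatever arguments/links LightningIRCLI itself sets up.
            super().add_arguments_to_parser(parser)
            # Add an extra top-level argument ...
            parser.add_argument("--notes", type=str, default="")
            # ... or link two config values (hypothetical keys) so they stay in sync.
            parser.link_arguments("data.batch_size", "model.batch_size")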
- static configure_optimizers(lightning_module: LightningModule, optimizer: Optimizer, lr_scheduler: WarmupLRScheduler | None = None) → Any[source]
Override to customize the configure_optimizers() method.
- Parameters:
lightning_module – A reference to the model.
optimizer – The optimizer.
lr_scheduler – The learning rate scheduler (if used).
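For example, a hedged sketch of such an override, returning the standard shapes that configure_optimizers() may return (the scheduler handling shown is an assumption, not the library's actual logic):

    from typing import Any
    from lightning.pytorch import LightningModule
    from torch.optim import Optimizer
    from lightning_ir.main import LightningIRCLI

    class MyCLI(LightningIRCLI):  # hypothetical subclass
        @staticmethod
        def configure_optimizers(
            lightning_module: LightningModule,
            optimizer: Optimizer,
            lr_scheduler=None,  # a lightning_ir WarmupLRScheduler, if configured
        ) -> Any:
            if lr_scheduler is None:
                return optimizer
            # Optimizer plus scheduler, in the dict format Lightning accepts.
            return {"optimizer": optimizer, "lr_scheduler": lr_scheduler}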
- static subcommands() → Dict[str, Set[str]][source]
Defines the list of available subcommands and the arguments to skip.
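For reference, the base LightningCLI implements this hook with the mapping below; LightningIRCLI overrides it to define its own subcommands, so treat the entries as illustrating the shape (subcommand name mapped to the set of arguments to skip), not Lightning IR's actual list:

    from typing import Dict, Set

    def subcommands() -> Dict[str, Set[str]]:  # a static method on the CLI class
        # Shape from the base LightningCLI: each key is a subcommand, each
        # value is the set of Trainer-method arguments the parser should skip.
        return {
            "fit": {"model", "train_dataloaders", "val_dataloaders", "datamodule"},
            "validate": {"model", "dataloaders", "datamodule"},
            "test": {"model", "dataloaders", "datamodule"},
            "predict": {"model", "dataloaders", "datamodule"},
        }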