dataclass_wizard package

Submodules

dataclass_wizard.abstractions module

Contains implementations for Abstract Base Classes

class dataclass_wizard.abstractions.AbstractDumper[source]

Bases: ABC

class dataclass_wizard.abstractions.AbstractJSONWizard[source]

Bases: ABC

Abstract class that defines the methods a sub-class must implement at a minimum to be considered a “true” JSON Wizard.

In particular, these are the abstract methods which - if correctly implemented - will allow a concrete sub-class (ideally a dataclass) to be properly loaded from, and serialized to, JSON.

abstract classmethod from_dict(o: Dict[str, Any]) W[source]

Converts a Python dict object to an instance of the dataclass.

abstract classmethod from_json(string: AnyStr) W | List[W][source]

Converts a JSON string to an instance of the dataclass, or a list of the dataclass instances.

abstract classmethod from_list(o: List[Dict[str, Any]]) List[W][source]

Converts a Python list object to a list of the dataclass instances.

abstract classmethod list_to_json(instances: ~typing.List[~dataclass_wizard.abstractions.W], encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, indent=None, **encoder_kwargs) AnyStr[source]

Converts a list of dataclass instances to a JSON string representation.

abstract to_dict() Dict[str, Any][source]

Converts the dataclass instance to a Python dictionary object that is JSON serializable.

abstract to_json(*, encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, indent=None, **encoder_kwargs) AnyStr[source]

Converts the dataclass instance to a JSON string representation.
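
For illustration, a minimal sketch of these methods in use, assuming a hypothetical Album dataclass that mixes in JSONWizard (an alias of JSONSerializable, the concrete implementation of this ABC):

from dataclasses import dataclass

from dataclass_wizard import JSONWizard


@dataclass
class Album(JSONWizard):
    title: str
    release_year: int


# from_dict / from_json map camelCase JSON keys to snake_case fields by default
album = Album.from_dict({'title': 'Blue', 'releaseYear': 1971})
same_album = Album.from_json('{"title": "Blue", "releaseYear": 1971}')

print(album.to_dict())   # {'title': 'Blue', 'releaseYear': 1971}
print(album.to_json())   # {"title": "Blue", "releaseYear": 1971}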

class dataclass_wizard.abstractions.AbstractLoader[source]

Bases: ABC

Abstract loader which defines the helper methods that can be used to load an object o into an object of annotated (or concrete) type base_type.

abstract static default_load_to(o: T, _: Any) T[source]

Default load function if no other paths match. Generally, this will be a stub load method.

abstract classmethod get_parser_for_annotation(ann_type: Type[T], base_cls: Type = None, extras: Extras = None) AbstractParser[source]

Returns the Parser (dispatcher) for a given annotation type.

base_cls is the original class object; this is useful when the annotated type is a typing.ForwardRef object

abstract static load_after_type_check(o: Any, base_type: Type[T]) T[source]

Load an object o, after confirming that it is indeed of type base_type.

Raises:

ParseError – If the object is not of the expected type.

abstract static load_to_bool(o: str | bool | N, _: Type[bool]) bool[source]

Load a bool, string, or a numeric value into a new object of type bool.

Note: bool cannot be sub-classed, so the base_type argument is discarded in this case.

abstract static load_to_date(o: str | N, base_type: Type[date]) date[source]

Load a string or number (int or float) into a new object of type base_type (generally a date or a sub-class of one)

abstract static load_to_datetime(o: str | N, base_type: Type[datetime]) datetime[source]

Load a string or number (int or float) into a new object of type base_type (generally a datetime or a sub-class of one)

abstract static load_to_decimal(o: N, base_type: Type[Decimal]) Decimal[source]

Load an object o into a new object of type base_type (generally a Decimal or a sub-class of one)

abstract static load_to_defaultdict(o: Dict, base_type: Type[DD], default_factory: Callable[[], T], key_parser: AbstractParser, val_parser: AbstractParser) DD[source]

Load an object o into a new object of type base_type (generally a collections.defaultdict or a sub-class of one)

abstract static load_to_dict(o: Dict, base_type: Type[M], key_parser: AbstractParser, val_parser: AbstractParser) M[source]

Load an object o into a new object of type base_type (generally a dict or a sub-class of one)

abstract static load_to_enum(o: AnyStr | N, base_type: Type[E]) E[source]

Load an object o into a new object of type base_type (generally a sub-class of the Enum type)

abstract static load_to_float(o: SupportsFloat | str, base_type: Type[N]) N[source]

Load a string or float into a new object of type base_type (generally a sub-class of the float type)

abstract static load_to_int(o: str | int | bool | None, base_type: Type[N]) N[source]

Load a string or int into a new object of type base_type (generally a sub-class of the int type)

abstract static load_to_iterable(o: Iterable, base_type: Type[LSQ], elem_parser: AbstractParser) LSQ[source]

Load a list, set, frozenset or deque into a new object of type base_type (generally a list, set, frozenset, deque, or a sub-class of one)

abstract static load_to_named_tuple(o: Dict | List | Tuple, base_type: Type[NT], field_to_parser: Dict[str, AbstractParser], field_parsers: List[AbstractParser]) NT[source]

Load a dictionary, list, or tuple to a NamedTuple sub-class

abstract static load_to_named_tuple_untyped(o: Dict | List | Tuple, base_type: Type[NT], dict_parser: AbstractParser, list_parser: AbstractParser) NT[source]

Load a dictionary, list, or tuple to a (generally) un-typed collections.namedtuple

abstract static load_to_str(o: str | N | None, base_type: Type[str]) str[source]

Load a string or numeric type into a new object of type base_type (generally a sub-class of the str type)

abstract static load_to_time(o: str, base_type: Type[time]) time[source]

Load a string into a new object of type base_type (generally a time or a sub-class of one)

abstract static load_to_timedelta(o: str | N, base_type: Type[timedelta]) timedelta[source]

Load a string or number (int or float) into a new object of type base_type (generally a timedelta or a sub-class of one)

abstract static load_to_tuple(o: List | Tuple, base_type: Type[Tuple], elem_parsers: Sequence[AbstractParser]) Tuple[source]

Load a list or tuple into a new object of type base_type (generally a tuple or a sub-class of one)

abstract static load_to_typed_dict(o: Dict, base_type: Type[M], key_to_parser: Dict[str, AbstractParser], required_keys: FrozenSet[str], optional_keys: FrozenSet[str]) M[source]

Load an object o annotated as a TypedDict sub-class into a new object of type base_type (generally a dict or a sub-class of one)

abstract static load_to_uuid(o: AnyStr | U, base_type: Type[U]) U[source]

Load an object o into a new object of type base_type (generally a sub-class of the UUID type)

abstract static transform_json_field(string: str) str[source]

Transform a JSON field name (which will typically be camel-cased) into the conventional format for a dataclass field name (which will ideally be snake-cased).

class dataclass_wizard.abstractions.AbstractParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[T])[source]

Bases: ABC

Abstract parsers, which will ideally act as dispatchers to route objects to the load or dump hook methods responsible for transforming the objects into the annotated type for the dataclass field whose value we want to set. The error handling logic should ideally be implemented on the Parser (dispatcher) side.

There can be more complex Parsers, for example ones which will handle typing.Union, typing.Literal, Dict, and NamedTuple types. There can even be nested Parsers, which will be useful for handling collection and sequence types.

base_type: Type[T]
cls: dataclasses.InitVar[Type]
extras: dataclasses.InitVar[Extras]

dataclass_wizard.bases module

class dataclass_wizard.bases.ABCOrAndMeta(name, bases, namespace, /, **kwargs)[source]

Bases: ABCMeta

Metaclass to add class-level __or__() and __and__() methods to a base class of type M.

class dataclass_wizard.bases.AbstractMeta[source]

Bases: object

Base class definition for the JSONWizard.Meta inner class.

all_fields = frozenset({'auto_assign_tags', 'debug_enabled', 'json_key_to_field', 'key_transform_with_dump', 'key_transform_with_load', 'marshal_date_time_as', 'raise_on_unknown_json_key', 'recursive', 'skip_defaults', 'tag', 'tag_key'})
auto_assign_tags: ClassVar[bool] = False
abstract classmethod bind_to(dataclass: Type, create=True, is_default=True)[source]

Initialize hook which applies the Meta config to dataclass, which is typically a subclass of JSONWizard.

Parameters:
  • dataclass – A class which has been decorated by the @dataclass decorator; typically this is a sub-class of JSONWizard.

  • create – When true, a separate loader/dumper will be created for the class. If disabled, this will access the root loader/dumper, so modifying this should affect global settings across all dataclasses that use the JSON load/dump process.

  • is_default – When enabled, the Meta will be cached as the default Meta config for the dataclass. Defaults to true.

debug_enabled: ClassVar[bool] = False
fields_to_merge = frozenset({'auto_assign_tags', 'debug_enabled', 'key_transform_with_dump', 'key_transform_with_load', 'marshal_date_time_as', 'raise_on_unknown_json_key', 'skip_defaults', 'tag_key'})
json_key_to_field: ClassVar[Dict[str, str]] = None
key_transform_with_dump: ClassVar[LetterCase | str] = None
key_transform_with_load: ClassVar[LetterCase | str] = None
marshal_date_time_as: ClassVar[DateTimeTo | str] = None
raise_on_unknown_json_key: ClassVar[bool] = False
recursive: ClassVar[bool] = True
skip_defaults: ClassVar[bool] = False
tag: ClassVar[str] = None
tag_key: ClassVar[str] = '__tag__'
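
As a sketch of how these settings are typically applied in practice (Package here is a hypothetical dataclass), the config is declared as an inner Meta class on a JSONWizard subclass:

from dataclasses import dataclass

from dataclass_wizard import JSONWizard


@dataclass
class Package(JSONWizard):

    class Meta(JSONWizard.Meta):
        key_transform_with_dump = 'SNAKE'
        raise_on_unknown_json_key = True
        skip_defaults = True

    name: str
    version: str = '0.0.1'
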
class dataclass_wizard.bases.BaseDumpHook[source]

Bases: object

Container class for type hooks.

classmethod get_dump_hook(typ: Type) Callable | None[source]

Retrieves the hook for a type, if one exists.

classmethod register_dump_hook(typ: Type, func: Callable)[source]

Registers the hook for a type, on the default dumper by default.

class dataclass_wizard.bases.BaseLoadHook[source]

Bases: object

Container class for type hooks.

classmethod get_load_hook(typ: Type) Callable | None[source]

Retrieves the hook for a type, if one exists.

classmethod register_load_hook(typ: Type, func: Callable)[source]

Registers the hook for a type, on the default loader by default.

dataclass_wizard.bases_meta module

Ideally should be in the bases module, however we’ll run into a Circular Import scenario if we move it there, since the loaders and dumpers modules both import directly from bases.

class dataclass_wizard.bases_meta.BaseJSONWizardMeta[source]

Bases: AbstractMeta

Superclass definition for the JSONWizard.Meta inner class.

See the implementation of the AbstractMeta class for the available config that can be set, as well as for descriptions on any implemented methods.

classmethod bind_to(dataclass: Type, create=True, is_default=True)[source]

Initialize hook which applies the Meta config to dataclass, which is typically a subclass of JSONWizard.

Parameters:
  • dataclass – A class which has been decorated by the @dataclass decorator; typically this is a sub-class of JSONWizard.

  • create – When true, a separate loader/dumper will be created for the class. If disabled, this will access the root loader/dumper, so modifying this should affect global settings across all dataclasses that use the JSON load/dump process.

  • is_default – When enabled, the Meta will be cached as the default Meta config for the dataclass. Defaults to true.

dataclass_wizard.bases_meta.DumpMeta(*, debug_enabled: bool = False, recursive: bool = True, marshal_date_time_as: DateTimeTo | str = None, key_transform: LetterCase | str = None, tag: str = None, skip_defaults: bool = False) Type[M][source]

Helper function to setup the Meta Config for the JSON dump (serialization) process, which is intended for use alongside the asdict helper function.

For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractMeta definition (I want to avoid duplicating the descriptions for params here).

Examples:

>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))
dataclass_wizard.bases_meta.LoadMeta(*, debug_enabled: bool = False, recursive: bool = True, raise_on_unknown_json_key: bool = False, json_key_to_field: Dict[str, str] = None, key_transform: LetterCase | str = None, tag: str = None) Type[M][source]

Helper function to setup the Meta Config for the JSON load (de-serialization) process, which is intended for use alongside the fromdict helper function.

For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractMeta definition (I want to avoid duplicating the descriptions for params here).

Examples:

>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
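
A slightly fuller sketch combining both helpers, assuming a hypothetical Item dataclass:

from dataclasses import dataclass

from dataclass_wizard import DumpMeta, LoadMeta, asdict, fromdict


@dataclass
class Item:
    my_str: str
    count: int = 0


# Bind separate Meta configs for the load and dump processes.
LoadMeta(key_transform='CAMEL').bind_to(Item)
DumpMeta(key_transform='SNAKE', skip_defaults=True).bind_to(Item)

item = fromdict(Item, {'myStr': 'value'})
print(asdict(item))   # {'my_str': 'value'}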

dataclass_wizard.class_helper module

dataclass_wizard.class_helper.call_meta_initializer_if_needed(cls: Type[W])[source]

Calls the Meta initializer when the inner Meta is sub-classed.

dataclass_wizard.class_helper.create_new_class(class_or_instance, bases: Tuple[T, ...], suffix: str | None = None, attr_dict=None) T[source]

Create (dynamically) and return a new class that sub-classes from a list of bases.

dataclass_wizard.class_helper.dataclass_field_names(cls) Tuple[str, ...][source]

Get the names of all dataclass fields

dataclass_wizard.class_helper.dataclass_field_to_default(cls) Dict[str, Any][source]

Get default values for the (optional) dataclass fields.

dataclass_wizard.class_helper.dataclass_field_to_json_field(cls)[source]

Returns a mapping of dataclass field to JSON field.

dataclass_wizard.class_helper.dataclass_field_to_load_parser(cls_loader: Type[AbstractLoader], cls: Type, config: Type[M], save: bool = True) DictWithLowerStore[str, AbstractParser][source]

Returns a mapping of each lower-cased field name to the load Parser for its annotated type.

dataclass_wizard.class_helper.dataclass_fields(cls) Tuple[Field][source]

Cache the dataclasses.fields() call for each class, as overall that ends up around 5x faster than making a fresh call each time.

dataclass_wizard.class_helper.dataclass_init_fields(cls) Tuple[Field][source]

Get only the dataclass fields that would be passed into the constructor.

dataclass_wizard.class_helper.dataclass_to_dumper(cls: Type)[source]

Returns the dumper for a dataclass.

dataclass_wizard.class_helper.dataclass_to_loader(cls)[source]

Returns the loader for a dataclass.

dataclass_wizard.class_helper.get_class(obj)[source]

Get the class for an object obj

dataclass_wizard.class_helper.get_class_name(class_or_instance) str[source]

Return the fully qualified name of a class.

dataclass_wizard.class_helper.get_meta(cls: Type) Type[M][source]

Retrieves the Meta config for the AbstractJSONWizard subclass.

This config is set when the inner Meta is sub-classed.

dataclass_wizard.class_helper.get_outer_class_name(inner_cls, default=None, raise_=True)[source]

Attempt to return the fully qualified name of the outer (enclosing) class, given a reference to the inner class.

If any errors occur - such as when inner_cls is not a real inner class - then an error will be raised if raise_ is true, and if not we will return default instead.

dataclass_wizard.class_helper.is_subclass(obj, base_cls: Type) bool[source]

Check if obj is a sub-class of base_cls

dataclass_wizard.class_helper.is_subclass_safe(cls, class_or_tuple) bool[source]

Check if cls is a sub-class of class_or_tuple (safer version)

dataclass_wizard.class_helper.json_field_to_dataclass_field(cls: Type)[source]

Returns a mapping of JSON field to dataclass field.

dataclass_wizard.class_helper.set_class_dumper(cls: Type, dumper: Type[AbstractDumper])[source]

Set (and return) the dumper for a dataclass.

dataclass_wizard.class_helper.set_class_loader(class_or_instance, loader: Type[AbstractLoader])[source]

Set (and return) the loader for a dataclass.

dataclass_wizard.class_helper.setup_dump_config_for_cls_if_needed(cls: Type)[source]

This function processes a class cls on an initial run, and sets up the dump process for cls by iterating over each dataclass field. For each field, it performs the following tasks:

  • Check if the field’s annotation is of type Annotated. If so, we iterate over each Annotated argument and find any special JSON objects (this can also be set via the helper function json_key). Assuming we find it, the class-specific mapping of dataclass field name to JSON key is then updated with the input passed in to this object.

  • Check if the field type is a JSONField object (this can also be set by the helper function json_field). Assuming this is the case, the class-specific mapping of dataclass field name to JSON key is then updated with the input passed in to the JSON attribute.

dataclass_wizard.constants module

dataclass_wizard.decorators module

class dataclass_wizard.decorators.cached_class_property(func)[source]

Bases: object

Descriptor decorator implementing a class-level, read-only property, which caches the attribute on-demand on the first use.

Credits: https://stackoverflow.com/a/4037979/10237506

class dataclass_wizard.decorators.cached_property(func)[source]

Bases: object

Descriptor decorator implementing an instance-level, read-only property, which caches the attribute on-demand on the first use.

dataclass_wizard.decorators.discard_kwargs(f)[source]
dataclass_wizard.decorators.resolve_alias_func(f: Callable, _locals: Dict = None, raise_=False) Callable[source]

Resolve the underlying single-arg alias function for f, using the provided function locals (which will be a dict). If f does not have an associated alias function, we return f itself.

Raises:

AttributeError – If raise_ is true and f is not a single-arg alias function.

dataclass_wizard.decorators.try_with_load(load_fn: Callable)[source]

Try to call a load hook, catch and re-raise errors as a ParseError.

Note: this function will be recursively called on all load hooks for a dataclass, when debug mode (the debug_enabled Meta setting) is enabled for the dataclass.

Parameters:

load_fn – The load hook, can be a regular callable, a single-arg alias, or an identity function.

Returns:

The decorated load hook.

dataclass_wizard.decorators.try_with_load_with_single_arg(original_fn: Callable, single_arg_load_fn: Callable, base_type: Type)[source]

Similar to try_with_load(), but for single-arg alias functions.

Parameters:
  • original_fn – The original load hook (function)

  • single_arg_load_fn – The single-argument load hook

  • base_type – The annotated (or desired) type

Returns:

The decorated load hook.

dataclass_wizard.dumpers module

The implementation below uses code adapted from the asdict helper function from the library Dataclasses (https://github.com/ericvsmith/dataclasses).

This library is available under the Apache 2.0 license, which can be obtained from http://www.apache.org/licenses/LICENSE-2.0.

See the end of this file for the original Apache license from this library.

class dataclass_wizard.dumpers.DumpMixin[source]

Bases: AbstractDumper, BaseDumpHook

This Mixin class derives its name from the eponymous json.dumps function. Essentially it contains helper methods to convert Python built-in types to a more ‘JSON-friendly’ version.

static default_dump_with(o, *_)[source]
static dump_with_bool(o: bool, *_)[source]
static dump_with_date(o: date, *_)[source]
static dump_with_datetime(o: datetime, *_)[source]
static dump_with_decimal(o: Decimal, *_)[source]
static dump_with_defaultdict(o: DD, _typ: Type[DD], *args)[source]
static dump_with_dict(o: Dict, typ: Type[Dict], *args)[source]
static dump_with_enum(o: E, *_)[source]
static dump_with_float(o: float, *_)[source]
static dump_with_int(o: int, *_)[source]
static dump_with_iterable(o: LSQ, _typ: Type[LSQ], *args)[source]
static dump_with_list_or_tuple(o: LT, typ: Type[LT], *args)[source]
static dump_with_named_tuple(o: NT, typ: Type[NT], *args)[source]
static dump_with_null(o: None, *_)[source]
static dump_with_str(o: str, *_)[source]
static dump_with_time(o: time, *_)[source]
static dump_with_timedelta(o: timedelta, *_)[source]
static dump_with_uuid(o: U, *_)[source]
static transform_dataclass_field(string: str) str

Convert a string to Camel Case.

Examples:

>>> to_camel_case("device_type")
'deviceType'
dataclass_wizard.dumpers.asdict(obj: ~dataclass_wizard.type_def.T, *, cls=None, dict_factory=<class 'dict'>, exclude: ~typing.List[str] = None, **kwargs) Dict[str, Any][source]

Return the fields of a dataclass instance as a new dictionary mapping field names to field values.

Example usage:

@dataclass
class C:
    x: int
    y: int

c = C(1, 2)
assert asdict(c) == {'x': 1, 'y': 2}

When directly invoking this function, an optional Meta configuration for the dataclass can be specified via DumpMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:

>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))

If given, ‘dict_factory’ will be used instead of built-in dict. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts.
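
A brief sketch of direct usage with the keyword arguments above (Point is a hypothetical dataclass; this assumes exclude takes dataclass field names to omit from the result):

from dataclasses import dataclass

from dataclass_wizard import asdict


@dataclass
class Point:
    x: int
    y: int
    label: str = ''


# 'label' is excluded from the output (an assumption about `exclude`)
d = asdict(Point(1, 2, 'origin'), exclude=['label'])
print(d)   # {'x': 1, 'y': 2}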

dataclass_wizard.dumpers.dump_func_for_dataclass(cls: Type[T], config: Type[M] | None = None, nested_cls_to_dump_func: Dict[Type, Any] = None) Callable[[T, Any, Any, Any], Dict[str, Any]][source]
dataclass_wizard.dumpers.get_dumper(cls=None, create=True) Type[DumpMixin][source]

Get the dumper for the class, using the following logic:

  • Return the class if it’s already a sub-class of DumpMixin

  • If create is enabled (which is the default), a new sub-class of DumpMixin for the class will be generated and cached on the initial run.

  • Otherwise, we will return the base dumper, DumpMixin, which can potentially be shared by more than one dataclass.

dataclass_wizard.dumpers.setup_default_dumper(cls=<class 'dataclass_wizard.dumpers.DumpMixin'>)[source]

Setup the default type hooks to use when converting dataclass instances to str (json)

Note: cls must be DumpMixin or a sub-class of it.

dataclass_wizard.enums module

Re-usable Enum definitions

class dataclass_wizard.enums.DateTimeTo(value, names=None, *values, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

ISO_FORMAT = 0
TIMESTAMP = 1
class dataclass_wizard.enums.LetterCase(value, names=None, *values, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

CAMEL = <dataclass_wizard.utils.wrappers.FuncWrapper object>
LISP = <dataclass_wizard.utils.wrappers.FuncWrapper object>
NONE = <dataclass_wizard.utils.wrappers.FuncWrapper object>
PASCAL = <dataclass_wizard.utils.wrappers.FuncWrapper object>
SNAKE = <dataclass_wizard.utils.wrappers.FuncWrapper object>
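
These members (or their names as plain strings) can be assigned to the key transform settings on the Meta config; a small sketch with a hypothetical Config dataclass:

from dataclasses import dataclass

from dataclass_wizard import JSONWizard
from dataclass_wizard.enums import LetterCase


@dataclass
class Config(JSONWizard):

    class Meta(JSONWizard.Meta):
        # either the enum member or its name, e.g. 'LISP', is accepted
        key_transform_with_dump = LetterCase.LISP

    max_retry_count: int


print(Config(3).to_dict())   # {'max-retry-count': 3}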

dataclass_wizard.errors module

exception dataclass_wizard.errors.JSONWizardError[source]

Bases: ABC, Exception

Base error class, for errors raised by this library.

abstract property message: str

Format and return an error message.

exception dataclass_wizard.errors.MissingData(nested_cls: Type, **kwargs)[source]

Bases: ParseError

Error raised when unable to create a class instance, as the JSON object is None.

property message: str

Format and return an error message.

static name(obj) str[source]

Return the type or class name of an object

exception dataclass_wizard.errors.MissingFields(base_err: Exception, obj: Dict[str, Any], cls: Type, cls_kwargs: Dict[str, Any], cls_fields: Tuple[Field], **kwargs)[source]

Bases: JSONWizardError

Error raised when unable to create a class instance (most likely due to missing arguments)

property message: str

Format and return an error message.

static name(obj) str[source]

Return the type or class name of an object

exception dataclass_wizard.errors.ParseError(base_err: Exception, obj: Any, ann_type: Type | Iterable, _default_class: type | None = None, _field_name: str | None = None, _json_object: Any = None, **kwargs)[source]

Bases: JSONWizardError

Base error when an error occurs during the JSON load process.

property class_name: str | None
property field_name: str | None
property json_object
property message: str

Format and return an error message.

static name(obj) str[source]

Return the type or class name of an object

exception dataclass_wizard.errors.UnknownJSONKey(json_key: str, obj: Dict[str, Any], cls: Type, cls_fields: Tuple[Field], **kwargs)[source]

Bases: JSONWizardError

Error raised when an unknown JSON key is encountered in the JSON load process.

Note that this error class is only raised when the raise_on_unknown_json_key flag is enabled in the Meta class.

property message: str

Format and return an error message.

static name(obj) str[source]

Return the type or class name of an object
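
A short sketch of how these errors typically surface during the load process, assuming a hypothetical Point dataclass:

from dataclasses import dataclass

from dataclass_wizard import LoadMeta, fromdict
from dataclass_wizard.errors import ParseError, UnknownJSONKey


@dataclass
class Point:
    x: int


# Unknown JSON keys only raise when this Meta flag is enabled.
LoadMeta(raise_on_unknown_json_key=True).bind_to(Point)

try:
    fromdict(Point, {'x': 1, 'z': 2})
except UnknownJSONKey as e:
    print('unknown key:', e.message)
except ParseError as e:
    print('parse error:', e.message)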

dataclass_wizard.lazy_imports module

Lazy Import definitions. Generally, these imports will be available when any “bonus features” are installed, i.e. as below:

$ pip install dataclass-wizard[timedelta]

dataclass_wizard.loaders module

class dataclass_wizard.loaders.LoadMixin[source]

Bases: AbstractLoader, BaseLoadHook

This Mixin class derives its name from the eponymous json.loads function. Essentially it contains helper methods to convert JSON strings (or a Python dictionary object) to a dataclass which can often contain complex types such as lists, dicts, or even other dataclasses nested within it.

Refer to the AbstractLoader class for documentation on any of the implemented methods.

static default_load_to(o: T, _: Any) T[source]

Default load function if no other paths match. Generally, this will be a stub load method.

classmethod get_parser_for_annotation(ann_type: Type[T], base_cls: Type = None, extras: Extras = None) AbstractParser[source]

Returns the Parser (dispatcher) for a given annotation type.

static load_after_type_check(o: Any, base_type: Type[T]) T[source]

Load an object o, after confirming that it is indeed of type base_type.

Raises:

ParseError – If the object is not of the expected type.

static load_to_bool(o: str | bool | N, _: Type[bool]) bool[source]

Load a bool, string, or a numeric value into a new object of type bool.

Note: bool cannot be sub-classed, so the base_type argument is discarded in this case.

static load_to_date(o: str | ~numbers.Number | ~datetime.date, base_type=<class 'datetime.date'>, default=None, raise_=True)

Attempt to convert an object o to a date object using the below logic.

  • str: convert date strings (in ISO format) via the built-in fromisoformat method.

  • Number (int or float): Convert a numeric timestamp via the built-in fromtimestamp method.

  • date: Return object o if it’s already of this type or sub-type.

Otherwise, if we’re unable to convert the value of o to a date as expected, raise an error if the raise_ parameter is true; if not, return default instead.

static load_to_datetime(o: str | ~numbers.Number | ~datetime.datetime, base_type=<class 'datetime.datetime'>, default=None, raise_=True)

Attempt to convert an object o to a datetime object using the below logic.

  • str: convert datetime strings (in ISO format) via the built-in fromisoformat method.

  • Number (int or float): Convert a numeric timestamp via the built-in fromtimestamp method.

  • datetime: Return object o if it’s already of this type or sub-type.

Otherwise, if we’re unable to convert the value of o to a datetime as expected, raise an error if the raise_ parameter is true; if not, return default instead.

static load_to_decimal(o: N, base_type: Type[Decimal]) Decimal[source]

Load an object o into a new object of type base_type (generally a Decimal or a sub-class of one)

static load_to_defaultdict(o: Dict, base_type: Type[DD], default_factory: Callable[[], T], key_parser: AbstractParser, val_parser: AbstractParser) DD[source]

Load an object o into a new object of type base_type (generally a collections.defaultdict or a sub-class of one)

static load_to_dict(o: Dict, base_type: Type[M], key_parser: AbstractParser, val_parser: AbstractParser) M[source]

Load an object o into a new object of type base_type (generally a dict or a sub-class of one)

static load_to_enum(o: AnyStr | N, base_type: Type[E]) E[source]

Load an object o into a new object of type base_type (generally a sub-class of the Enum type)

static load_to_float(o: SupportsFloat | str, base_type: Type[N]) N[source]

Load a string or float into a new object of type base_type (generally a sub-class of the float type)

static load_to_int(o: str | int | float | bool | None, base_type=<class 'int'>, default=0, raise_=True)

Return o if already an int, otherwise return the int value for a string. If o is None or an empty string, return default instead.

If o cannot be converted to an int, raise an error if raise_ is true; otherwise, return default instead.

Raises:
  • TypeError – If o is a bool (which is an int sub-class)

  • ValueError – When o cannot be converted to an int, and the raise_ parameter is true

static load_to_iterable(o: Iterable, base_type: Type[LSQ], elem_parser: AbstractParser) LSQ[source]

Load a list, set, frozenset or deque into a new object of type base_type (generally a list, set, frozenset, deque, or a sub-class of one)

static load_to_named_tuple(o: Dict | List | Tuple, base_type: Type[NT], field_to_parser: Dict[str, AbstractParser], field_parsers: List[AbstractParser]) NT[source]

Load a dictionary, list, or tuple to a NamedTuple sub-class

static load_to_named_tuple_untyped(o: Dict | List | Tuple, base_type: Type[NT], dict_parser: AbstractParser, list_parser: AbstractParser) NT[source]

Load a dictionary, list, or tuple to a (generally) un-typed collections.namedtuple

static load_to_str(o: str | None, base_type=<class 'str'>, raise_=True)

Return o if already a str, otherwise return the string value for o. If o is None or an empty string, return default instead.

If o cannot be converted to a str, raise an error if raise_ is true; otherwise, return default instead.

static load_to_time(o: str | ~datetime.time, base_type=<class 'datetime.time'>, default=None, raise_=True)

Attempt to convert an object o to a time object using the below logic.

  • str: convert time strings (in ISO format) via the built-in fromisoformat method.

  • time: Return object o if it’s already of this type or sub-type.

Otherwise, if we’re unable to convert the value of o to a time as expected, raise an error if the raise_ parameter is true; if not, return default instead.

static load_to_timedelta(o: str | ~dataclass_wizard.type_def.N | ~datetime.timedelta, base_type=<class 'datetime.timedelta'>, default=None, raise_=True)

Attempt to convert an object o to a timedelta object using the below logic.

  • str: If the string is in a numeric form like “1.23”, we convert it to a float and assume it’s in seconds. Otherwise, we convert strings via the pytimeparse.parse function.

  • int or float: A numeric value is assumed to be in seconds. In this case, it is passed in to the constructor like timedelta(seconds=...)

  • timedelta: Return object o if it’s already of this type or sub-type.

Otherwise, if we’re unable to convert the value of o to a timedelta as expected, raise an error if the raise_ parameter is true; if not, return default instead.

static load_to_tuple(o: List | Tuple, base_type: Type[Tuple], elem_parsers: Sequence[AbstractParser]) Tuple[source]

Load a list or tuple into a new object of type base_type (generally a tuple or a sub-class of one)

static load_to_typed_dict(o: Dict, base_type: Type[M], key_to_parser: Dict[str, AbstractParser], required_keys: FrozenSet[str], optional_keys: FrozenSet[str]) M[source]

Load an object o annotated as a TypedDict sub-class into a new object of type base_type (generally a dict or a sub-class of one)

static load_to_uuid(o: AnyStr | U, base_type: Type[U]) U[source]

Load an object o into a new object of type base_type (generally a sub-class of the UUID type)

static transform_json_field(string: str) str

Make an underscored, lowercase form from the expression in the string.

Example:

>>> to_snake_case("DeviceType")
'device_type'
dataclass_wizard.loaders.fromdict(cls: Type[T], d: Dict[str, Any]) T[source]

Converts a Python dictionary object to a dataclass instance.

Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.

When directly invoking this function, an optional Meta configuration for the dataclass can be specified via LoadMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:

>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
dataclass_wizard.loaders.fromlist(cls: Type[T], list_of_dict: List[Dict[str, Any]]) List[T][source]

Converts a Python list object to a list of dataclass instances.

Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.
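
A minimal sketch with a hypothetical Pet dataclass:

from dataclasses import dataclass

from dataclass_wizard import fromlist


@dataclass
class Pet:
    name: str
    age: int


pets = fromlist(Pet, [{'name': 'Rex', 'age': 3}, {'name': 'Mia', 'age': 2}])
print(pets)   # [Pet(name='Rex', age=3), Pet(name='Mia', age=2)]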

dataclass_wizard.loaders.get_loader(class_or_instance=None, create=True) Type[LoadMixin][source]

Get the loader for the class, using the following logic:

  • Return the class if it’s already a sub-class of LoadMixin

  • If create is enabled (which is the default), a new sub-class of LoadMixin for the class will be generated and cached on the initial run.

  • Otherwise, we will return the base loader, LoadMixin, which can potentially be shared by more than one dataclass.

dataclass_wizard.loaders.load_func_for_dataclass(cls: Type[T], is_main_class: bool = True, config: Type[M] | None = None) Callable[[Dict[str, Any]], T][source]
dataclass_wizard.loaders.setup_default_loader(cls=<class 'dataclass_wizard.loaders.LoadMixin'>)[source]

Setup the default type hooks to use when converting str (json) or a Python dict object to a dataclass instance.

Note: cls must be LoadMixin or a sub-class of it.

dataclass_wizard.log module

dataclass_wizard.models module

class dataclass_wizard.models.Container(iterable=(), /)[source]

Bases: List[T]

Convenience wrapper around a collection of dataclass instances.

For all intents and purposes, this should behave exactly as a list object.

Usage:

>>> from dataclass_wizard import Container, fromlist
>>> from dataclasses import make_dataclass
>>>
>>> A = make_dataclass('A', [('f1', str), ('f2', int)])
>>> list_of_a = fromlist(A, [{'f1': 'hello', 'f2': 1}, {'f1': 'world', 'f2': 2}])
>>> c = Container[A](list_of_a)
>>> print(c.prettify())
prettify(encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, ensure_ascii=False, **encoder_kwargs) str[source]

Convert the list of instances to a prettified JSON string.

to_json(encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, **encoder_kwargs) str[source]

Convert the list of instances to a JSON string.

to_json_file(file: str, mode: str = 'w', encoder: ~dataclass_wizard.type_def.FileEncoder = <function dump>, **encoder_kwargs) None[source]

Serializes the list of instances and writes it to a JSON file.

class dataclass_wizard.models.DatePattern[source]

Bases: date, _PatternBase

An annotated type representing a date pattern (i.e. format string). Upon de-serialization, the resolved type will be a date instead.

See the docs on Pattern() for more info.

class dataclass_wizard.models.DateTimePattern[source]

Bases: datetime, _PatternBase

An annotated type representing a datetime pattern (i.e. format string). Upon de-serialization, the resolved type will be a datetime instead.

See the docs on Pattern() for more info.

class dataclass_wizard.models.Extras[source]

Bases: TypedDict

“Extra” config that can be used in the load / dump process.

config: Type[M]
pattern: _PatternedDT
class dataclass_wizard.models.JSON(*keys: str, all=False, dump=True)[source]

Bases: object

Represents one or more mappings of JSON keys.

See the docs on the json_key() function for more info.

all
dump
keys
class dataclass_wizard.models.JSONField(keys: str | Collection[str], all: bool, dump: bool, default, default_factory, init, repr, hash, compare, metadata)[source]

Bases: Field

Alias to a dataclasses.Field, but one which also represents a mapping of one or more JSON key names to a dataclass field.

See the docs on the json_field() function for more info.

json
dataclass_wizard.models.Pattern(pattern: str)[source]

Represents a pattern (i.e. format string) for a date / time / datetime type or subtype. For example, a custom pattern like below:

%d, %b, %Y %H:%M:%S.%f

A sample usage of Pattern, using a subclass of time:

time_field: Annotated[List[MyTime], Pattern('%I:%M %p')]
Parameters:

pattern – A format string to be passed in to datetime.strptime
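
A short sketch using Pattern alongside the pre-defined DatePattern annotation (Event is a hypothetical dataclass; the subscripted DatePattern[...] shorthand is assumed per the library docs):

from dataclasses import dataclass
from datetime import date
from typing import Annotated

from dataclass_wizard import DatePattern, JSONWizard, Pattern


@dataclass
class Event(JSONWizard):
    # Two equivalent ways to parse a non-ISO date string on load.
    start: Annotated[date, Pattern('%m/%d/%Y')]
    end: DatePattern['%m/%d/%Y']


e = Event.from_dict({'start': '01/02/2021', 'end': '03/04/2021'})
print(e.start, e.end)   # 2021-01-02 2021-03-04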

class dataclass_wizard.models.TimePattern[source]

Bases: time, _PatternBase

An annotated type representing a time pattern (i.e. format string). Upon de-serialization, the resolved type will be a time instead.

See the docs on Pattern() for more info.

dataclass_wizard.models.json_field(keys: str | ~typing.Collection[str], *, all=False, dump=True, default=<dataclasses._MISSING_TYPE object>, default_factory=<dataclasses._MISSING_TYPE object>, init=True, repr=True, hash=None, compare=True, metadata=None)[source]

This is a helper function that sets the same defaults for keyword arguments as the dataclasses.field function. It can be thought of as an alias to dataclasses.field(...), but one which also represents a mapping of one or more JSON key names to a dataclass field.

This is only in addition to the default key transform; for example, a JSON key appearing as “myField”, “MyField” or “my-field” will already map to a dataclass field “my_field” by default (assuming the key transform converts to snake case).

The mapping to each JSON key name is case-sensitive, so passing “myfield” will not match a “myField” key in a JSON string or a Python dict object.

keys is a string, or a collection (list, tuple, etc.) of strings. It represents one or more JSON keys to associate with the dataclass field.

When all is passed as True (default is False), it will also associate the reverse mapping, i.e. from dataclass field to JSON key. If multiple JSON keys are passed in, it uses the first one provided in this case. This mapping is then used when to_dict or to_json is called, instead of the default key transform.

When dump is passed as False (default is True), this field will be skipped, or excluded, in the serialization process to JSON.

dataclass_wizard.models.json_key(*keys: str, all=False, dump=True)[source]

Represents a mapping of one or more JSON key names for a dataclass field.

This is only in addition to the default key transform; for example, a JSON key appearing as “myField”, “MyField” or “my-field” will already map to a dataclass field “my_field” by default (assuming the key transform converts to snake case).

The mapping to each JSON key name is case-sensitive, so passing “myfield” will not match a “myField” key in a JSON string or a Python dict object.

Parameters:
  • keys – A list of one or more JSON keys to associate with the dataclass field.

  • all – True to also associate the reverse mapping, i.e. from dataclass field to JSON key. If multiple JSON keys are passed in, it uses the first one provided in this case. This mapping is then used when to_dict or to_json is called, instead of the default key transform.

  • dump – False to skip this field in the serialization process to JSON. By default, this field and its value are included.
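
A combined sketch of json_key and json_field, assuming a hypothetical User dataclass:

from dataclasses import dataclass
from typing import Annotated

from dataclass_wizard import JSONWizard, json_field, json_key


@dataclass
class User(JSONWizard):
    # Map the non-standard JSON key 'user-id' to this field; all=True also
    # uses 'user-id' (instead of the key transform) on to_dict / to_json.
    user_id: Annotated[int, json_key('user-id', all=True)]
    # json_field() mirrors dataclasses.field(), plus the JSON key mapping.
    email: str = json_field(('emailAddr', 'email'), default='')


u = User.from_dict({'user-id': 1, 'emailAddr': 'a@b.io'})
print(u.to_dict())   # {'user-id': 1, 'email': 'a@b.io'}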

dataclass_wizard.parsers module

class dataclass_wizard.parsers.DefaultDictParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[DD], hook: Callable[[Any, Type[DD], Callable[[], T], dataclass_wizard.abstractions.AbstractParser, dataclass_wizard.abstractions.AbstractParser], DD], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: MappingParser

default_factory
class dataclass_wizard.parsers.IdentityParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[T])[source]

Bases: AbstractParser

cls: dataclasses.InitVar[Type]
extras: dataclasses.InitVar[Extras]
class dataclass_wizard.parsers.IterableParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[LSQ], hook: Callable[[Iterable, Type[LSQ], AbstractParser], LSQ], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: AbstractParser

Parser for a list, set, frozenset, or deque, or a subclass of any of these types.

elem_parser
get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]]
hook: Callable[[Iterable, Type[LSQ], AbstractParser], LSQ]
class dataclass_wizard.parsers.LiteralParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[M])[source]

Bases: AbstractParser

value_to_type
class dataclass_wizard.parsers.MappingParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[M], hook: Callable[[Any, Type[M], dataclass_wizard.abstractions.AbstractParser, dataclass_wizard.abstractions.AbstractParser], M], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: AbstractParser

get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]]
hook: Callable[[Any, Type[M], AbstractParser, AbstractParser], M]
key_parser
val_parser
class dataclass_wizard.parsers.NamedTupleParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[S], hook: Callable[[Any, Type[~NT], Optional[Dict[str, ForwardRef('AbstractParser')]], List[dataclass_wizard.abstractions.AbstractParser]], ~NT], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: AbstractParser

field_parsers
field_to_parser
get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]]
hook: Callable[[Any, Type[NT], Dict[str, AbstractParser] | None, List[AbstractParser]], NT]
class dataclass_wizard.parsers.NamedTupleUntypedParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[S], hook: Callable[[Any, Type[NT], dataclass_wizard.abstractions.AbstractParser, dataclass_wizard.abstractions.AbstractParser], NT], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: AbstractParser

dict_parser
get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]]
hook: Callable[[Any, Type[NT], AbstractParser, AbstractParser], NT]
list_parser
class dataclass_wizard.parsers.OptionalParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[T], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: AbstractParser

get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]]
parser
class dataclass_wizard.parsers.Parser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[T], hook: Callable[[Any, Type[T]], T])[source]

Bases: AbstractParser

hook: Callable[[Any, Type[T]], T]
class dataclass_wizard.parsers.PatternedDTParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: dataclass_wizard.models._PatternedDT)[source]

Bases: AbstractParser

hook
class dataclass_wizard.parsers.SingleArgParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[T], hook: Callable[[Any], T])[source]

Bases: AbstractParser

hook: Callable[[Any], T]
class dataclass_wizard.parsers.TupleParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[S], hook: Callable[[Any, Type[S], Tuple[AbstractParser, ...]], S], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: AbstractParser

Parser for subscripted and un-subscripted Tuples.

See VariadicTupleParser for the parser that handles the variadic form, i.e. Tuple[str, ...]

elem_parsers
get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]]
hook: Callable[[Any, Type[S], Tuple[AbstractParser, ...]], S]
required_count
total_count
class dataclass_wizard.parsers.TypedDictParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[S], hook: Callable[[Any, Type[~M], Dict[str, ForwardRef('AbstractParser')], FrozenSet[str], FrozenSet[str]], ~M], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: AbstractParser

get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]]
hook: Callable[[Any, Type[M], Dict[str, AbstractParser], FrozenSet[str], FrozenSet[str]], M]
key_to_parser
optional_keys
required_keys
class dataclass_wizard.parsers.UnionParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Tuple[Type[T], ...], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: AbstractParser

get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]]
parsers
tag_key
tag_to_parser
class dataclass_wizard.parsers.VariadicTupleParser(cls: dataclasses.InitVar[Type], extras: dataclasses.InitVar[Extras], base_type: Type[S], hook: Callable[[Any, Type[S], Tuple[AbstractParser, ...]], S], get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])[source]

Bases: TupleParser

Parser that handles the variadic form of Tuples, i.e. Tuple[str, ...]

Per PEP 484, only one required type is allowed before the Ellipsis. That is, Tuple[int, ...] is valid whereas Tuple[int, str, ...] would be invalid.

cls: dataclasses.InitVar[Type]
extras: dataclasses.InitVar[Extras]
first_elem_parser
get_parser: dataclasses.InitVar[Callable[[Type[T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]]

dataclass_wizard.property_wizard module

dataclass_wizard.property_wizard.property_wizard(*args, **kwargs)[source]

Adds support for field properties with default values in dataclasses.

For examples of usage, please see the Using Field Properties section in the docs. I also added an answer on a SO article that deals with using such properties in dataclasses.

dataclass_wizard.serial_json module

class dataclass_wizard.serial_json.JSONSerializable[source]

Bases: AbstractJSONWizard

Mixin class to allow a dataclass sub-class to be easily converted to and from JSON.

class Meta[source]

Bases: BaseJSONWizardMeta

Inner meta class that can be extended by sub-classes for additional customization with the JSON load / dump process.

classmethod from_dict(d: Dict[str, Any]) T

Converts a Python dictionary object to a dataclass instance.

Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.

When directly invoking this function, an optional Meta configuration for the dataclass can be specified via LoadMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:

>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
classmethod from_json(string: AnyStr, *, decoder: ~dataclass_wizard.type_def.Decoder = <function loads>, **decoder_kwargs) W | List[W][source]

Converts a JSON string to an instance of the dataclass, or a list of the dataclass instances.

classmethod from_list(list_of_dict: List[Dict[str, Any]]) List[T]

Converts a Python list object to a list of dataclass instances.

Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.

classmethod list_to_json(instances: ~typing.List[~dataclass_wizard.abstractions.W], encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, **encoder_kwargs) AnyStr[source]

Converts a list of dataclass instances to a JSON string representation.

to_dict(*, cls=None, dict_factory=<class 'dict'>, exclude: ~typing.List[str] = None, **kwargs) Dict[str, Any]

Return the fields of a dataclass instance as a new dictionary mapping field names to field values.

Example usage:

@dataclass
class C:
    x: int
    y: int

c = C(1, 2)
assert asdict(c) == {'x': 1, 'y': 2}

When directly invoking this function, an optional Meta configuration for the dataclass can be specified via DumpMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:

>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))

If given, ‘dict_factory’ will be used instead of built-in dict. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts.

to_json(*, encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, **encoder_kwargs) AnyStr[source]

Converts the dataclass instance to a JSON string representation.

dataclass_wizard.type_def module

class dataclass_wizard.type_def.Decoder(*args, **kwargs)[source]

Bases: Protocol

Represents a decoder for JSON -> Python object, e.g. analogous to json.loads

class dataclass_wizard.type_def.Encoder(*args, **kwargs)[source]

Bases: Protocol

Represents an encoder for Python object -> JSON, e.g. analogous to json.dumps
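
Any callable compatible with the json.dumps signature can serve as an Encoder; a small sketch with a hypothetical Msg dataclass:

import json
from dataclasses import dataclass

from dataclass_wizard import JSONWizard


@dataclass
class Msg(JSONWizard):
    text: str


m = Msg('héllo')
# encoder_kwargs (here ensure_ascii and indent) are forwarded to the encoder.
print(m.to_json(encoder=json.dumps, ensure_ascii=False, indent=2))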

class dataclass_wizard.type_def.ExplicitNullType[source]

Bases: object

class dataclass_wizard.type_def.FileDecoder(*args, **kwargs)[source]

Bases: Protocol

Represents a decoder for JSON file -> Python object, e.g. analogous to json.load

class dataclass_wizard.type_def.FileEncoder(*args, **kwargs)[source]

Bases: Protocol

Represents an encoder for Python object -> JSON file, e.g. analogous to json.dump

class dataclass_wizard.type_def.NoneType

Bases: object

dataclass_wizard.type_def.PyForwardRef

alias of ForwardRef

dataclass_wizard.type_def.PyProtocol

alias of Protocol

dataclass_wizard.type_def.PyTypedDict(typename, fields=None, /, *, total=True, **kwargs)

A simple typed namespace. At runtime it is equivalent to a plain dict.

TypedDict creates a dictionary type such that a type checker will expect all instances to have a certain set of keys, where each key is associated with a value of a consistent type. This expectation is not checked at runtime.

Usage:

class Point2D(TypedDict):
    x: int
    y: int
    label: str

a: Point2D = {'x': 1, 'y': 2, 'label': 'good'}  # OK
b: Point2D = {'z': 3, 'label': 'bad'}           # Fails type check

assert Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first')

The type info can be accessed via the Point2D.__annotations__ dict, and the Point2D.__required_keys__ and Point2D.__optional_keys__ frozensets. TypedDict supports an additional equivalent form:

Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': str})

By default, all keys must be present in a TypedDict. It is possible to override this by specifying totality:

class Point2D(TypedDict, total=False):
    x: int
    y: int

This means that a Point2D TypedDict can have any of the keys omitted. A type checker is only expected to support a literal False or True as the value of the total argument. True is the default, and makes all items defined in the class body be required.

The Required and NotRequired special forms can also be used to mark individual keys as being required or not required:

class Point2D(TypedDict):
    x: int               # the "x" key must always be present (Required is the default)
    y: NotRequired[int]  # the "y" key can be omitted

See PEP 655 for more details on Required and NotRequired.

dataclass_wizard.wizard_mixins module

Helper Wizard Mixin classes.

class dataclass_wizard.wizard_mixins.JSONFileWizard[source]

Bases: object

A Mixin class that makes it easier to interact with JSON files.

This can be paired with the JSONSerializable (JSONWizard) Mixin class for more complete extensibility.

classmethod from_json_file(file: str, *, decoder: ~dataclass_wizard.type_def.FileDecoder = <function load>, **decoder_kwargs) T | List[T][source]

Reads in the JSON file contents and converts to an instance of the dataclass, or a list of the dataclass instances.

to_json_file(file: str, mode: str = 'w', encoder: ~dataclass_wizard.type_def.FileEncoder = <function dump>, **encoder_kwargs) None[source]

Serializes the instance and writes it to a JSON file.
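
A minimal sketch, assuming a hypothetical Task dataclass and an illustrative file name:

from dataclasses import dataclass

from dataclass_wizard import JSONFileWizard


@dataclass
class Task(JSONFileWizard):
    name: str
    done: bool = False


t = Task('write docs')
t.to_json_file('task.json')            # serialize the instance to a file
t2 = Task.from_json_file('task.json')  # read it back in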

class dataclass_wizard.wizard_mixins.JSONListWizard[source]

Bases: JSONSerializable

A Mixin class that extends JSONSerializable (JSONWizard) to return Container - instead of list - objects.

Note that Container objects are simply convenience wrappers around a collection of dataclass instances. For all intents and purposes, they behave exactly the same as list objects, with some added helper methods:

  • prettify - Convert the list of instances to a prettified JSON string.

  • to_json - Convert the list of instances to a JSON string.

  • to_json_file - Serialize the list of instances and write it to a JSON file.

classmethod from_json(string: AnyStr, *, decoder: ~dataclass_wizard.type_def.Decoder = <function loads>, **decoder_kwargs) W | Container[W][source]

Converts a JSON string to an instance of the dataclass, or a Container (list) of the dataclass instances.

classmethod from_list(o: List[Dict[str, Any]]) Container[W][source]

Converts a Python list object to a Container (list) of the dataclass instances.
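
A minimal sketch with a hypothetical Row dataclass:

from dataclasses import dataclass

from dataclass_wizard import JSONListWizard


@dataclass
class Row(JSONListWizard):
    id: int


rows = Row.from_list([{'id': 1}, {'id': 2}])   # returns a Container, not a list
print(rows.prettify())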

class dataclass_wizard.wizard_mixins.YAMLWizard[source]

Bases: object

A Mixin class that makes it easier to interact with YAML data.

Note

The default key transform used in the YAML dump process is lisp-case, however this can easily be customized without the need to sub-class from JSONWizard.

For example:

>>> @dataclass
>>> class MyClass(YAMLWizard, key_transform='CAMEL'):
>>>     ...
classmethod from_yaml(string_or_stream: AnyStr | TextIO | BinaryIO, *, decoder: Decoder | None = None, **decoder_kwargs) T | List[T][source]

Converts a YAML string to an instance of the dataclass, or a list of the dataclass instances.

classmethod from_yaml_file(file: str, *, decoder: FileDecoder | None = None, **decoder_kwargs) T | List[T][source]

Reads in the YAML file contents and converts to an instance of the dataclass, or a list of the dataclass instances.

classmethod list_to_yaml(instances: List[T], encoder: Encoder | None = None, **encoder_kwargs) AnyStr[source]

Converts a list of dataclass instances to a YAML string representation.

to_yaml(*, encoder: Encoder | None = None, **encoder_kwargs) AnyStr[source]

Converts the dataclass instance to a YAML string representation.

to_yaml_file(file: str, mode: str = 'w', encoder: FileEncoder | None = None, **encoder_kwargs) None[source]

Serializes the instance and writes it to a YAML file.
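
A minimal sketch with a hypothetical Settings dataclass (this assumes the PyYAML dependency, e.g. the yaml extra, is installed):

from dataclasses import dataclass

from dataclass_wizard import YAMLWizard


@dataclass
class Settings(YAMLWizard):
    host_name: str
    port: int = 8080


s = Settings.from_yaml('host-name: localhost\nport: 5000')
print(s.to_yaml())   # keys are lisp-cased by default, e.g. 'host-name'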

Module contents

Dataclass Wizard

Marshal dataclasses to/from JSON and Python dict objects. Support properties with initial values. Generate a dataclass schema for JSON input.

Sample Usage:

>>> from dataclasses import dataclass, field
>>> from datetime import datetime
>>> from typing import Optional, List
>>>
>>> from dataclass_wizard import JSONSerializable, property_wizard
>>>
>>>
>>> @dataclass
>>> class MyClass(JSONSerializable, metaclass=property_wizard):
>>>
>>>     my_str: Optional[str]
>>>     list_of_int: List[int] = field(default_factory=list)
>>>     # You can also define this as `my_dt`, however only the annotation
>>>     # will carry over in that case, since the value is re-declared by
>>>     # the property below.
>>>     _my_dt: datetime = datetime(2000, 1, 1)
>>>
>>>     @property
>>>     def my_dt(self):
>>>     # A sample `getter` which returns the datetime with year set as 2010
>>>         if self._my_dt is not None:
>>>             return self._my_dt.replace(year=2010)
>>>         return self._my_dt
>>>
>>>     @my_dt.setter
>>>     def my_dt(self, new_dt: datetime):
>>>     # A sample `setter` which sets the inverse (roughly) of the `month` and `day`
>>>         self._my_dt = new_dt.replace(month=13 - new_dt.month,
>>>                                      day=30 - new_dt.day)
>>>
>>>
>>> string = '''{"myStr": 42, "listOFInt": [1, "2", 3]}'''
>>> c = MyClass.from_json(string)
>>> print(repr(c))
>>> # prints:
>>> #   MyClass(
>>> #       my_str='42',
>>> #       list_of_int=[1, 2, 3],
>>> #       my_dt=datetime.datetime(2010, 12, 29, 0, 0)
>>> #   )
>>> my_dict = {'My_Str': 'string', 'myDT': '2021-01-20T15:55:30Z'}
>>> c = MyClass.from_dict(my_dict)
>>> print(repr(c))
>>> # prints:
>>> #   MyClass(
>>> #       my_str='string',
>>> #       list_of_int=[],
>>> #       my_dt=datetime.datetime(2010, 12, 10, 15, 55, 30,
>>> #                               tzinfo=datetime.timezone.utc)
>>> #   )
>>> print(c.to_json())
>>> # prints:
>>> #   {"myStr": "string", "listOfInt": [], "myDt": "2010-12-10T15:55:30Z"}

For full documentation and more advanced usage, please see <https://dataclass-wizard.readthedocs.io>.

copyright:
  (c) 2021 by Ritvik Nag.

license:
  Apache 2.0, see LICENSE for more details.

class dataclass_wizard.Container(iterable=(), /)[source]

Bases: List[T]

Convenience wrapper around a collection of dataclass instances.

For all intents and purposes, this should behave exactly as a list object.

Usage:

>>> from dataclass_wizard import Container, fromlist
>>> from dataclasses import make_dataclass
>>>
>>> A = make_dataclass('A', [('f1', str), ('f2', int)])
>>> list_of_a = fromlist(A, [{'f1': 'hello', 'f2': 1}, {'f1': 'world', 'f2': 2}])
>>> c = Container[A](list_of_a)
>>> print(c.prettify())
prettify(encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, ensure_ascii=False, **encoder_kwargs) str[source]

Convert the list of instances to a prettified JSON string.

to_json(encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, **encoder_kwargs) str[source]

Convert the list of instances to a JSON string.

to_json_file(file: str, mode: str = 'w', encoder: ~dataclass_wizard.type_def.FileEncoder = <function dump>, **encoder_kwargs) None[source]

Serializes the list of instances and writes it to a JSON file.

class dataclass_wizard.DatePattern[source]

Bases: date, _PatternBase

An annotated type representing a date pattern (i.e. format string). Upon de-serialization, the resolved type will be a date instead.

See the docs on Pattern() for more info.
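For example, a date field might be annotated as shown below; this is a sketch, and the Event dataclass, field name, and format string are illustrative only:

>>> from dataclasses import dataclass
>>> from dataclass_wizard import JSONWizard, DatePattern
>>>
>>> @dataclass
>>> class Event(JSONWizard):
>>>     # de-serialized with the given format string, into a `date` object
>>>     date_field: DatePattern['%m/%d/%y']
>>>
>>> Event.from_dict({'date_field': '01/02/21'}).date_field  # -> date(2021, 1, 2)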

class dataclass_wizard.DateTimePattern[source]

Bases: datetime, _PatternBase

An annotated type representing a datetime pattern (i.e. format string). Upon de-serialization, the resolved type will be a datetime instead.

See the docs on Pattern() for more info.

dataclass_wizard.DumpMeta(*, debug_enabled: bool = False, recursive: bool = True, marshal_date_time_as: DateTimeTo | str = None, key_transform: LetterCase | str = None, tag: str = None, skip_defaults: bool = False) Type[M][source]

Helper function to set up the Meta Config for the JSON dump (serialization) process, which is intended for use alongside the asdict helper function.

For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractMeta definition (I want to avoid duplicating the descriptions for params here).

Examples:

>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))
class dataclass_wizard.DumpMixin[source]

Bases: AbstractDumper, BaseDumpHook

This Mixin class derives its name from the eponymous json.dumps function. Essentially it contains helper methods to convert Python built-in types to a more ‘JSON-friendly’ version.

static default_dump_with(o, *_)[source]
static dump_with_bool(o: bool, *_)[source]
static dump_with_date(o: date, *_)[source]
static dump_with_datetime(o: datetime, *_)[source]
static dump_with_decimal(o: Decimal, *_)[source]
static dump_with_defaultdict(o: DD, _typ: Type[DD], *args)[source]
static dump_with_dict(o: Dict, typ: Type[Dict], *args)[source]
static dump_with_enum(o: E, *_)[source]
static dump_with_float(o: float, *_)[source]
static dump_with_int(o: int, *_)[source]
static dump_with_iterable(o: LSQ, _typ: Type[LSQ], *args)[source]
static dump_with_list_or_tuple(o: LT, typ: Type[LT], *args)[source]
static dump_with_named_tuple(o: NT, typ: Type[NT], *args)[source]
static dump_with_null(o: None, *_)[source]
static dump_with_str(o: str, *_)[source]
static dump_with_time(o: time, *_)[source]
static dump_with_timedelta(o: timedelta, *_)[source]
static dump_with_uuid(o: U, *_)[source]
static transform_dataclass_field(string: str) str

Convert a string to Camel Case.

Examples:

>>> to_camel_case("device_type")
'deviceType'
class dataclass_wizard.JSONFileWizard[source]

Bases: object

A Mixin class that makes it easier to interact with JSON files.

This can be paired with the JSONSerializable (JSONWizard) Mixin class for more complete extensibility.

classmethod from_json_file(file: str, *, decoder: ~dataclass_wizard.type_def.FileDecoder = <function load>, **decoder_kwargs) T | List[T][source]

Reads in the JSON file contents and converts to an instance of the dataclass, or a list of the dataclass instances.

to_json_file(file: str, mode: str = 'w', encoder: ~dataclass_wizard.type_def.FileEncoder = <function dump>, **encoder_kwargs) None[source]

Serializes the instance and writes it to a JSON file.
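A brief sketch of pairing this mixin with JSONSerializable (JSONWizard); the Settings dataclass and file name are hypothetical:

>>> from dataclasses import dataclass
>>> from dataclass_wizard import JSONWizard, JSONFileWizard
>>>
>>> @dataclass
>>> class Settings(JSONWizard, JSONFileWizard):
>>>     theme: str = 'dark'
>>>
>>> s = Settings.from_json_file('settings.json')
>>> s.to_json_file('settings.json', indent=2)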

class dataclass_wizard.JSONListWizard[source]

Bases: JSONSerializable

A Mixin class that extends JSONSerializable (JSONWizard) to return Container - instead of list - objects.

Note that Container objects are simply convenience wrappers around a collection of dataclass instances. For all intents and purposes, they behave exactly the same as list objects, with some added helper methods:

  • prettify - Convert the list of instances to a prettified JSON string.

  • to_json - Convert the list of instances to a JSON string.

  • to_json_file - Serialize the list of instances and write it to a JSON file.

classmethod from_json(string: AnyStr, *, decoder: ~dataclass_wizard.type_def.Decoder = <function loads>, **decoder_kwargs) W | Container[W][source]

Converts a JSON string to an instance of the dataclass, or a Container (list) of the dataclass instances.

classmethod from_list(o: List[Dict[str, Any]]) Container[W][source]

Converts a Python list object to a Container (list) of the dataclass instances.

class dataclass_wizard.JSONSerializable[source]

Bases: AbstractJSONWizard

Mixin class to allow a dataclass sub-class to be easily converted to and from JSON.

class Meta[source]

Bases: BaseJSONWizardMeta

Inner meta class that can be extended by sub-classes for additional customization with the JSON load / dump process.
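For instance, a sub-class can declare an inner class that extends this Meta; the sketch below is illustrative, and the chosen attribute values are assumptions rather than defaults:

>>> from dataclasses import dataclass
>>> from dataclass_wizard import JSONWizard
>>>
>>> @dataclass
>>> class MyClass(JSONWizard):
>>>     class _(JSONWizard.Meta):
>>>         # illustrative settings only
>>>         key_transform_with_dump = 'SNAKE'
>>>         skip_defaults = True
>>>
>>>     my_str: str = 'value'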

classmethod from_dict(d: Dict[str, Any]) T

Converts a Python dictionary object to a dataclass instance.

Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.

When directly invoking this function, an optional Meta configuration for the dataclass can be specified via LoadMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:

>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
classmethod from_json(string: AnyStr, *, decoder: ~dataclass_wizard.type_def.Decoder = <function loads>, **decoder_kwargs) W | List[W][source]

Converts a JSON string to an instance of the dataclass, or a list of the dataclass instances.

classmethod from_list(list_of_dict: List[Dict[str, Any]]) List[T]

Converts a Python list object to a list of dataclass instances.

Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.

classmethod list_to_json(instances: ~typing.List[~dataclass_wizard.abstractions.W], encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, **encoder_kwargs) AnyStr[source]

Converts a list of dataclass instances to a JSON string representation.

to_dict(*, cls=None, dict_factory=<class 'dict'>, exclude: ~typing.List[str] = None, **kwargs) Dict[str, Any]

Return the fields of a dataclass instance as a new dictionary mapping field names to field values.

Example usage:

  @dataclass
  class C:
      x: int
      y: int

  c = C(1, 2)
  assert asdict(c) == {'x': 1, 'y': 2}

When directly invoking this function, an optional Meta configuration for the dataclass can be specified via DumpMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:

>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))

If given, ‘dict_factory’ will be used instead of built-in dict. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts.

to_json(*, encoder: ~dataclass_wizard.type_def.Encoder = <function dumps>, **encoder_kwargs) AnyStr[source]

Converts the dataclass instance to a JSON string representation.

dataclass_wizard.JSONWizard

alias of JSONSerializable

dataclass_wizard.LoadMeta(*, debug_enabled: bool = False, recursive: bool = True, raise_on_unknown_json_key: bool = False, json_key_to_field: Dict[str, str] = None, key_transform: LetterCase | str = None, tag: str = None) Type[M][source]

Helper function to set up the Meta Config for the JSON load (de-serialization) process, which is intended for use alongside the fromdict helper function.

For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractMeta definition (I want to avoid duplicating the descriptions for params here).

Examples:

>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
class dataclass_wizard.LoadMixin[source]

Bases: AbstractLoader, BaseLoadHook

This Mixin class derives its name from the eponymous json.loads function. Essentially it contains helper methods to convert JSON strings (or a Python dictionary object) to a dataclass which can often contain complex types such as lists, dicts, or even other dataclasses nested within it.

Refer to the AbstractLoader class for documentation on any of the implemented methods.

static default_load_to(o: T, _: Any) T[source]

Default load function if no other paths match. Generally, this will be a stub load method.

classmethod get_parser_for_annotation(ann_type: Type[T], base_cls: Type = None, extras: Extras = None) AbstractParser[source]

Returns the Parser (dispatcher) for a given annotation type.

static load_after_type_check(o: Any, base_type: Type[T]) T[source]

Load an object o, after confirming that it is indeed of type base_type.

Raises:

ParseError – If the object is not of the expected type.

static load_to_bool(o: str | bool | N, _: Type[bool]) bool[source]

Load a bool, string, or numeric value into a new object of type bool.

Note: bool cannot be sub-classed, so the base_type argument is discarded in this case.

static load_to_date(o: str | ~numbers.Number | ~datetime.date, base_type=<class 'datetime.date'>, default=None, raise_=True)

Attempt to convert an object o to a date object using the below logic.

  • str: convert date strings (in ISO format) via the built-in fromisoformat method.

  • Number (int or float): Convert a numeric timestamp via the built-in fromtimestamp method.

  • date: Return object o if it’s already of this type or sub-type.

Otherwise, if we’re unable to convert the value of o to a date as expected, raise an error if the raise_ parameter is true; if not, return default instead.

static load_to_datetime(o: str | ~numbers.Number | ~datetime.datetime, base_type=<class 'datetime.datetime'>, default=None, raise_=True)

Attempt to convert an object o to a datetime object using the below logic.

  • str: convert datetime strings (in ISO format) via the built-in fromisoformat method.

  • Number (int or float): Convert a numeric timestamp via the built-in fromtimestamp method.

  • datetime: Return object o if it’s already of this type or sub-type.

Otherwise, if we’re unable to convert the value of o to a datetime as expected, raise an error if the raise_ parameter is true; if not, return default instead.
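As a sketch of the two conversion paths (results shown as comments, since the timestamp path uses the local timezone via fromtimestamp):

>>> from dataclass_wizard import LoadMixin
>>>
>>> LoadMixin.load_to_datetime('2021-01-02T12:30:45')  # ISO string -> datetime(2021, 1, 2, 12, 30, 45)
>>> LoadMixin.load_to_datetime(1610000000)             # numeric timestamp, via fromtimestamp (local time)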

static load_to_decimal(o: N, base_type: Type[Decimal]) Decimal[source]

Load an object o into a new object of type base_type (generally a Decimal or a sub-class of one)

static load_to_defaultdict(o: Dict, base_type: Type[DD], default_factory: Callable[[], T], key_parser: AbstractParser, val_parser: AbstractParser) DD[source]

Load an object o into a new object of type base_type (generally a collections.defaultdict or a sub-class of one)

static load_to_dict(o: Dict, base_type: Type[M], key_parser: AbstractParser, val_parser: AbstractParser) M[source]

Load an object o into a new object of type base_type (generally a dict or a sub-class of one)

static load_to_enum(o: AnyStr | N, base_type: Type[E]) E[source]

Load an object o into a new object of type base_type (generally a sub-class of the Enum type)

static load_to_float(o: SupportsFloat | str, base_type: Type[N]) N[source]

Load a string or float into a new object of type base_type (generally a sub-class of the float type)

static load_to_int(o: str | int | float | bool | None, base_type=<class 'int'>, default=0, raise_=True)

Return o if already an int; otherwise, return the int value for a string. If o is None or an empty string, return default instead.

If o cannot be converted to an int, raise an error if raise_ is true; otherwise, return default instead.

Raises:
  • TypeError – If o is a bool (which is an int sub-class)

  • ValueError – When o cannot be converted to an int, and the raise_ parameter is true

static load_to_iterable(o: Iterable, base_type: Type[LSQ], elem_parser: AbstractParser) LSQ[source]

Load a list, set, frozenset or deque into a new object of type base_type (generally a list, set, frozenset, deque, or a sub-class of one)

static load_to_named_tuple(o: Dict | List | Tuple, base_type: Type[NT], field_to_parser: Dict[str, AbstractParser], field_parsers: List[AbstractParser]) NT[source]

Load a dictionary, list, or tuple to a NamedTuple sub-class

static load_to_named_tuple_untyped(o: Dict | List | Tuple, base_type: Type[NT], dict_parser: AbstractParser, list_parser: AbstractParser) NT[source]

Load a dictionary, list, or tuple to a (generally) un-typed collections.namedtuple

static load_to_str(o: str | None, base_type=<class 'str'>, raise_=True)

Return o if already a str; otherwise, return the string value for o. If o is None or an empty string, return default instead.

If o cannot be converted to a str, raise an error if raise_ is true; otherwise, return default instead.

static load_to_time(o: str | ~datetime.time, base_type=<class 'datetime.time'>, default=None, raise_=True)

Attempt to convert an object o to a time object using the below logic.

  • str: convert time strings (in ISO format) via the built-in fromisoformat method.

  • time: Return object o if it’s already of this type or sub-type.

Otherwise, if we’re unable to convert the value of o to a time as expected, raise an error if the raise_ parameter is true; if not, return default instead.

static load_to_timedelta(o: str | ~dataclass_wizard.type_def.N | ~datetime.timedelta, base_type=<class 'datetime.timedelta'>, default=None, raise_=True)

Attempt to convert an object o to a timedelta object using the below logic.

  • str: If the string is in a numeric form like “1.23”, we convert it to a float and assume it’s in seconds. Otherwise, we convert strings via the pytimeparse.parse function.

  • int or float: A numeric value is assumed to be in seconds. In this case, it is passed in to the constructor like timedelta(seconds=...)

  • timedelta: Return object o if it’s already of this type or sub-type.

Otherwise, if we’re unable to convert the value of o to a timedelta as expected, raise an error if the raise_ parameter is true; if not, return default instead.
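A brief sketch of the paths described above (results shown as comments):

>>> from dataclass_wizard import LoadMixin
>>>
>>> LoadMixin.load_to_timedelta('1.5')   # numeric string, assumed seconds -> timedelta(seconds=1.5)
>>> LoadMixin.load_to_timedelta('32m')   # parsed via pytimeparse -> timedelta(seconds=1920)
>>> LoadMixin.load_to_timedelta(90)      # int, assumed seconds -> timedelta(seconds=90)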

static load_to_tuple(o: List | Tuple, base_type: Type[Tuple], elem_parsers: Sequence[AbstractParser]) Tuple[source]

Load a list or tuple into a new object of type base_type (generally a tuple or a sub-class of one)

static load_to_typed_dict(o: Dict, base_type: Type[M], key_to_parser: Dict[str, AbstractParser], required_keys: FrozenSet[str], optional_keys: FrozenSet[str]) M[source]

Load an object o annotated as a TypedDict sub-class into a new object of type base_type (generally a dict or a sub-class of one)

static load_to_uuid(o: AnyStr | U, base_type: Type[U]) U[source]

Load an object o into a new object of type base_type (generally a sub-class of the UUID type)

static transform_json_field(string: str) str

Make an underscored, lowercase form from the expression in the string.

Example:

>>> to_snake_case("DeviceType")
'device_type'
dataclass_wizard.Pattern(pattern: str)[source]

Represents a pattern (i.e. format string) for a date / time / datetime type or subtype. For example, a custom pattern like below:

%d, %b, %Y %H:%M:%S.%f

A sample usage of Pattern, using a subclass of time:

time_field: Annotated[List[MyTime], Pattern('%I:%M %p')]
Parameters:

pattern – A format string to be passed in to datetime.strptime

class dataclass_wizard.TimePattern[source]

Bases: time, _PatternBase

An annotated type representing a time pattern (i.e. format string). Upon de-serialization, the resolved type will be a time instead.

See the docs on Pattern() for more info.

class dataclass_wizard.YAMLWizard[source]

Bases: object

A Mixin class that makes it easier to interact with YAML data.

Note

The default key transform used in the YAML dump process is lisp-case; however, this can easily be customized without the need to sub-class from JSONWizard.

For example:

>>> @dataclass
>>> class MyClass(YAMLWizard, key_transform='CAMEL'):
>>>     ...
classmethod from_yaml(string_or_stream: AnyStr | TextIO | BinaryIO, *, decoder: Decoder | None = None, **decoder_kwargs) T | List[T][source]

Converts a YAML string to an instance of the dataclass, or a list of the dataclass instances.

classmethod from_yaml_file(file: str, *, decoder: FileDecoder | None = None, **decoder_kwargs) T | List[T][source]

Reads in the YAML file contents and converts to an instance of the dataclass, or a list of the dataclass instances.

classmethod list_to_yaml(instances: List[T], encoder: Encoder | None = None, **encoder_kwargs) AnyStr[source]

Converts a list of dataclass instances to a YAML string representation.

to_yaml(*, encoder: Encoder | None = None, **encoder_kwargs) AnyStr[source]

Converts the dataclass instance to a YAML string representation.

to_yaml_file(file: str, mode: str = 'w', encoder: FileEncoder | None = None, **encoder_kwargs) None[source]

Serializes the instance and writes it to a YAML file.

dataclass_wizard.asdict(obj: ~dataclass_wizard.type_def.T, *, cls=None, dict_factory=<class 'dict'>, exclude: ~typing.List[str] = None, **kwargs) Dict[str, Any][source]

Return the fields of a dataclass instance as a new dictionary mapping field names to field values.

Example usage:

  @dataclass
  class C:
      x: int
      y: int

  c = C(1, 2)
  assert asdict(c) == {'x': 1, 'y': 2}

When directly invoking this function, an optional Meta configuration for the dataclass can be specified via DumpMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:

>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))

If given, ‘dict_factory’ will be used instead of built-in dict. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts.

dataclass_wizard.fromdict(cls: Type[T], d: Dict[str, Any]) T[source]

Converts a Python dictionary object to a dataclass instance.

Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.

When directly invoking this function, an optional Meta configuration for the dataclass can be specified via LoadMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:

>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
dataclass_wizard.fromlist(cls: Type[T], list_of_dict: List[Dict[str, Any]]) List[T][source]

Converts a Python list object to a list of dataclass instances.

Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.

dataclass_wizard.json_field(keys: str | ~typing.Collection[str], *, all=False, dump=True, default=<dataclasses._MISSING_TYPE object>, default_factory=<dataclasses._MISSING_TYPE object>, init=True, repr=True, hash=None, compare=True, metadata=None)[source]

This is a helper function that sets the same defaults for keyword arguments as the dataclasses.field function. It can be thought of as an alias to dataclasses.field(...), but one which also represents a mapping of one or more JSON key names to a dataclass field.

This is only in addition to the default key transform; for example, a JSON key appearing as “myField”, “MyField” or “my-field” will already map to a dataclass field “my_field” by default (assuming the key transform converts to snake case).

The mapping to each JSON key name is case-sensitive, so passing “myfield” will not match a “myField” key in a JSON string or a Python dict object.

keys is a string, or a collection (list, tuple, etc.) of strings. It represents one or more JSON keys to associate with the dataclass field.

When all is passed as True (default is False), it will also associate the reverse mapping, i.e. from dataclass field to JSON key. If multiple JSON keys are passed in, it uses the first one provided in this case. This mapping is then used when to_dict or to_json is called, instead of the default key transform.

When dump is passed as False (default is True), this field will be skipped, or excluded, in the serialization process to JSON.
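As an illustrative sketch (the field and JSON key names below are hypothetical):

>>> from dataclasses import dataclass
>>> from dataclass_wizard import JSONWizard, json_field
>>>
>>> @dataclass
>>> class MyClass(JSONWizard):
>>>     # either JSON key maps to `my_str`; with all=True, the first key
>>>     # ("myAlias") is also used on dump, instead of the default key transform
>>>     my_str: str = json_field(('myAlias', 'my-alias'), all=True, default='')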

dataclass_wizard.json_key(*keys: str, all=False, dump=True)[source]

Represents a mapping of one or more JSON key names for a dataclass field.

This is only in addition to the default key transform; for example, a JSON key appearing as “myField”, “MyField” or “my-field” will already map to a dataclass field “my_field” by default (assuming the key transform converts to snake case).

The mapping to each JSON key name is case-sensitive, so passing “myfield” will not match a “myField” key in a JSON string or a Python dict object.

Parameters:
  • keys – A list of one or more JSON keys to associate with the dataclass field.

  • all – True to also associate the reverse mapping, i.e. from dataclass field to JSON key. If multiple JSON keys are passed in, it uses the first one provided in this case. This mapping is then used when to_dict or to_json is called, instead of the default key transform.

  • dump – False to skip this field in the serialization process to JSON. By default, this field and its value are included.
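A sketch of how this might be used with typing.Annotated; the field and key names are hypothetical:

>>> from dataclasses import dataclass
>>> from typing import Annotated
>>> from dataclass_wizard import JSONWizard, json_key
>>>
>>> @dataclass
>>> class MyClass(JSONWizard):
>>>     # either key in the input maps to `my_str`; with all=True, the first
>>>     # key ("myAlias") is also used when dumping to a dict or JSON string
>>>     my_str: Annotated[str, json_key('myAlias', 'my-alias', all=True)]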

dataclass_wizard.property_wizard(*args, **kwargs)[source]

Adds support for field properties with default values in dataclasses.

For examples of usage, please see the Using Field Properties section in the docs. I also added an answer to a Stack Overflow question that deals with using such properties in dataclasses.