Code Usage

The code can be operated in two separate ways:

  • 1) Fully automated pipeline
  • 2) Manual Usage of Modules

1) Fully automated pipeline

This mode analyzes the entire study automatically and logs the results (final biologic penetration profiles) into MLflow. All intermediate outputs (e.g. segmented masks) are stored on the hard disk.

This approach requires a config.json file, which contains the pipeline settings (see the config page).
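The config page documents the exact schema; as a rough, hypothetical sketch, the top-level keys read by the pipeline look like this (all values are placeholders, and the inner method-parameter keys are omitted here):

```json
{
    "data": {
        "source": { "raw": "data/your_study_name/source/raw" }
    },
    "segmentation_method_vessel": {},
    "segmentation_method_tumor": {},
    "segmentation_postprocessing_tumor": {},
    "distance_tranform": { "method_parameters": {} },
    "pixels_to_microns": 1.0,
    "mlflow_logging": true,
    "mlflow_run_name": "your_run_name"
}
```

The `distance_tranform` spelling follows the key as it appears in the source code.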

Once the config.json is prepared, the code can be run via either of the two files below:

  • main.py (Command line version)
  • master_script.py (Script version, which may be run from JupyterLab or any other Python IDE)

main.py

This file provides the command-line (terminal) interface. Use it when the entire analysis should be performed automatically and you are interested only in inspecting the results in MLflow.

Using a command line interface (terminal), the code can be run as:

main module: module to run the whole pipeline

main(config_path)

Main function for the entire pipeline, for automated usage. It wraps the whole pipeline into one function which can be invoked from the terminal. The process is fully automated and driven by the config.json file. See the Config file documentation.

Parameters

config_path : (pathlib.PosixPath) Relative path to the config file defining the entire process (note the expected folder structure below).

3D_Tumor_Lightsheet_Analysis_Pipeline
└─ data
   └─ your_study_name
      ├─ config.json
      └─ source
         └─ raw
            ├─ tumor
            │   ├─ 5IT-4X_Ch2_z0300.tiff
            │   ├─    ...
            │   └─ 5IT-4X_Ch2_z1300.tiff
            ├─ vessel
            │   ├─ 5IT-4X_Ch3_z0300.tiff
            │   ├─    ...
            │   └─ 5IT-4X_Ch3_z1300.tiff
            └─ virus
                ├─ 5IT-4X_Ch1_z0300.tiff
                ├─    ...
                └─ 5IT-4X_Ch1_z1300.tiff
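The relative path passed to `main` can be built with pathlib; a minimal sketch, with `your_study_name` as a placeholder for your actual study folder:

```python
from pathlib import Path

# Relative path to the config file, mirroring the folder structure above
# ("your_study_name" is a placeholder, not a real study folder).
config_path = Path("data") / "your_study_name" / "config.json"

# main() resolves this path against the repository root internally
# via root_directory_path.joinpath(config_path).
print(config_path)         # data/your_study_name/config.json
print(config_path.parent)  # data/your_study_name
```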

Returns

Results are saved on disk (segmented masks, distance transform) and logged to MLflow (if enabled in the config).

Example Usage

>>>conda activate 3d
>>>python main.py --Config File: data/5IT_DUMMY_STUDY/config.json
Source code in 3D_Tumor_Lightsheet_Analysis_Pipeline/main.py
def main(config_path: pathlib.PosixPath) -> None:

    """

    Main Function for the entire pipeline for automated usage.
    It wraps the entire pipeline into one function which can be sourced from the terminal.
    Entire process is fully automated and uses **config.json** file.
    See [Config file documentation](config.md)

    Parameters
    ----------

    **config_path** : *(pathlib.PosixPath)* Relative path to the config file defining the entire process (Notice the expected folder structure).

    ```bash
    3D_Tumor_Lightsheet_Analysis_Pipeline
    └─ data
       └─ your_study_name
          ├─ config.json
          └─ source
             └─ raw
                ├─ tumor
                │   ├─ 5IT-4X_Ch2_z0300.tiff
                │   ├─    ...
                │   └─ 5IT-4X_Ch2_z1300.tiff
                ├─ vessel
                │   ├─ 5IT-4X_Ch3_z0300.tiff
                │   ├─    ...
                │   └─ 5IT-4X_Ch3_z1300.tiff
                └─ virus
                    ├─ 5IT-4X_Ch1_z0300.tiff
                    ├─    ...
                    └─ 5IT-4X_Ch1_z1300.tiff
    ```

    Returns
    ------

    Results are saved on the disk (segmented masks, distance transform) and saved to mlflow (if provided in config).


    Example Usage
    --------------

    ```python

    >>>conda activate 3d
    >>>python main.py --Config File: data/5IT_DUMMY_STUDY/config.json

    ```

    """

    # Config file
    config_path = root_directory_path.joinpath(config_path)
    experiment = load_config(config_path)

    # script name to log in MLFLOW
    SCRIPT_NAME = "main.py"

    #######################################
    # # DATA PREPROCESSING
    # #####################################

    experiment["data"]["source"]["transformed"] = data_preprocessing_wrapper(
        experiment["data"]["source"]["raw"]
    )

    #######################################
    # # SEGMENTATION
    # #####################################

    # segmentation of blood vessels
    out_path = segmentation_wrapper(
        experiment["data"]["source"]["transformed"]["vessel"],
        **experiment["segmentation_method_vessel"],
    )
    experiment["data"]["results"]["segmentation"]["vessel"] = out_path

    # segmentation of tumors
    out_path = segmentation_wrapper(
        experiment["data"]["source"]["transformed"]["tumor"],
        **experiment["segmentation_method_tumor"],
    )
    experiment["data"]["results"]["segmentation"]["tumor"] = out_path

    # # postprocessing tumor masks
    out_path = postprocess_masks(
        experiment["data"]["results"]["segmentation"]["tumor"],
        **experiment["segmentation_postprocessing_tumor"],
    )
    experiment["data"]["results"]["segmentation_postprocessing"][
        "tumor"
    ] = out_path

    #######################################
    # # DISTANCE TRANSFORM
    # #####################################

    out_path = calculate_distance_tranform(
        experiment["data"]["results"]["segmentation"]["vessel"],
        **experiment["distance_tranform"]["method_parameters"],
    )
    experiment["data"]["results"]["distance_transform"]["vessel"] = out_path

    #######################################
    # # PROFILE
    # #####################################

    # calculating the final profile
    profiles = calculate_profile(
        experiment["data"]["source"]["transformed"]["virus"],
        experiment["data"]["results"]["distance_transform"]["vessel"],
        experiment["data"]["results"]["segmentation_postprocessing"]["tumor"],
        experiment["pixels_to_microns"],
        force_overwrite=False,
    )

    #######################################
    # # ML-FLOW LOGGING
    # #####################################
    print(MLFLOW_EXPERIMENT_NAME)
    if experiment["mlflow_logging"]:
        mlflow_logging(
            experiment,
            profiles,
            MLFLOW_TRACKING_URI,
            MLFLOW_EXPERIMENT_NAME,
            experiment["mlflow_run_name"],
            SCRIPT_NAME,
        )

master_script.py

This file is a script version of the main.py file, which allows for interactive (cell-by-cell) usage.

2) Manual Usage of Modules

One can choose to run the modules individually, e.g. only blood vessel segmentation or distance transform. This allows for further customization of the code and does not require the config.json file.

See the Documentation of individual modules for further details: Preprocessing, Segmentation, Postprocessing, Distance Transform, Profiles.

Even when using the modules individually, each function's documentation can still be accessed within JupyterLab: the Contextual Helper displays the function's documentation (see below).
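As a self-contained illustration (using a toy function rather than one of the pipeline's modules), the Contextual Helper simply surfaces the function's docstring:

```python
def example_wrapper(input_path):
    """Toy docstring standing in for a pipeline module's documentation.

    In JupyterLab, typing ``example_wrapper?`` in a cell, or opening the
    Contextual Help panel with the cursor on the name, shows this text.
    """
    return input_path

# The same text is available programmatically:
print(example_wrapper.__doc__)
```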