atlinter.optical_flow module

Optical flow computation based on pairs of images.

class atlinter.optical_flow.GeneOpticalFlow(gene_data: GeneDataset, reference_volume: ndarray, model: OpticalFlow)[source]

Bases: object

Computation of optical flow for a gene dataset.
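
A possible construction sketch (assuming gene_data is a GeneDataset prepared beforehand and reference_volume is an array of the same shape; the checkpoint path is the one used in the RAFTNet example further below):

>>> from atlinter.optical_flow import GeneOpticalFlow, RAFTNet
>>> # Make sure the checkpoint and the data exist, then uncomment these lines
>>> # model = RAFTNet("data/checkpoints/RAFT/models/raft-things.pth")
>>> # gene_flow = GeneOpticalFlow(gene_data, reference_volume, model)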

Parameters:
  • gene_data – Gene dataset. It contains a volume of reference shape with all known slices placed at their correct positions, and a metadata dictionary describing the axis of the dataset and the section numbers.

  • reference_volume – Reference volume used to compute the optical flow. It must have the same shape as gene_data.volume.

  • model – Optical flow model used to compute the flow between two images.

predict_ref_flow(idx_from: int, idx_to: int) ndarray[source]

Compute optical flow between two given slices of the reference volume.

Parameters:
  • idx_from – First section to consider.

  • idx_to – Second section to consider.

Returns:

flow – Predicted flow between the two given sections of the reference volume.

Return type:

np.ndarray

Raises:

ValueError – If idx_from or idx_to is outside the boundaries of the reference space.
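
For illustration, a sketch assuming gene_flow is a GeneOpticalFlow instance built as in the sketch above:

>>> # flow = gene_flow.predict_ref_flow(idx_from=10, idx_to=11)
>>> # `flow` is the predicted flow between reference sections 10 and 11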

predict_slice(slice_number: int) ndarray[source]

Predict one gene slice.

Parameters:

slice_number – Section number of the slice to predict.

Returns:

Predicted gene slice. Array of shape (dim1, dim2, 3), where (dim1, dim2) is (528, 320) for a sagittal dataset and (320, 456) for a coronal dataset.

Return type:

np.ndarray
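
A sketch, assuming the same gene_flow instance as above:

>>> # predicted = gene_flow.predict_slice(slice_number=15)
>>> # predicted.shape  # (528, 320, 3) for a sagittal dataset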

predict_volume() ndarray[source]

Predict the entire volume from the known gene slices.

This function might be slow.

Returns:

Entire gene volume. Array with the same shape as the GeneDataset volume.

Return type:

np.ndarray
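
A sketch, assuming the same gene_flow instance as above (this call may take a while):

>>> # volume = gene_flow.predict_volume()
>>> # `volume` has the same shape as gene_data.volume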

class atlinter.optical_flow.MaskFlowNet(checkpoint_path, gpu_device='')[source]

Bases: OpticalFlow

MaskFlowNet model for optical flow computation.

The typical use is

>>> from atlinter.optical_flow import MaskFlowNet
>>> checkpoint = "data/checkpoints/maskflownet.params"
>>> # Make sure the checkpoint exists and uncomment the following line
>>> # net = MaskFlowNet(checkpoint)
Parameters:
  • checkpoint_path (str) – Path to checkpoint of the model. See references for possibilities.

  • gpu_device (str) – Device on which to load the optical flow model.

network = <module 'atlinter.vendor.MaskFlowNet.network'>

predict_flow(img1, img2)[source]

Compute optical flow between two images.

Parameters:
  • img1 (np.ndarray) – The left image.

  • img2 (np.ndarray) – The right image.

Returns:

flow – The optical flow.

Return type:

np.ndarray
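
Continuing the class example above (a sketch; passing the images through preprocess_images first is an assumption based on the method documented below):

>>> # img1, img2 = net.preprocess_images(img1=img1, img2=img2)
>>> # flow = net.predict_flow(img1=img1, img2=img2)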

predict_new_data = <module 'atlinter.vendor.MaskFlowNet.predict_new_data'>

preprocess_images(img1, img2)[source]

Preprocess images.

Parameters:
  • img1 (np.ndarray) – The left image.

  • img2 (np.ndarray) – The right image.

Returns:

  • img1 (np.ndarray) – The pre-processed left image.

  • img2 (np.ndarray) – The pre-processed right image.

class atlinter.optical_flow.OpticalFlow[source]

Bases: ABC

Class representing an optical flow model.
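
A minimal subclassing sketch (an illustration only, not part of the library); a concrete model only needs to implement predict_flow:

>>> import numpy as np
>>> from atlinter.optical_flow import OpticalFlow
>>> class ZeroFlow(OpticalFlow):
...     """Toy model predicting no motion at all."""
...     def predict_flow(self, img1, img2):
...         # One (row, column) displacement per pixel, all zeros.
...         return np.zeros((*img1.shape, 2))
>>> ZeroFlow().predict_flow(np.zeros((4, 4)), np.ones((4, 4))).shape
(4, 4, 2)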

abstract predict_flow(img1, img2)[source]

Compute optical flow between two images.

Parameters:
  • img1 (np.ndarray) – The left image.

  • img2 (np.ndarray) – The right image.

Returns:

flow – The optical flow of shape (*image.shape, 2).

Return type:

np.ndarray

preprocess_images(img1, img2)[source]

Preprocess images.

Parameters:
  • img1 (np.ndarray) – The left image.

  • img2 (np.ndarray) – The right image.

Returns:

  • img1 (np.ndarray) – The pre-processed left image.

  • img2 (np.ndarray) – The pre-processed right image.

classmethod warp_image(flow, img2, order=1)[source]

Warp image with the predicted flow.

Parameters:
  • flow (np.ndarray) – The predicted optical flow of shape (*image.shape, 2).

  • img2 (np.ndarray) – The right image.

  • order (int) – The interpolation order. 0 = nearest neighbour, 1 = linear, 2 = quadratic, 3 = cubic, etc.

Returns:

warped – The warped image.

Return type:

np.ndarray
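
A usage sketch, assuming net is an instance of any concrete subclass (RAFTNet or MaskFlowNet) and img1 and img2 are two neighbouring sections:

>>> # flow = net.predict_flow(img1, img2)
>>> # warped = net.warp_image(flow, img2, order=1)  # warp img2 with the flow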

class atlinter.optical_flow.RAFTNet(path, device='cpu')[source]

Bases: OpticalFlow

RAFT model for optical flow computation.

The typical use is

>>> from atlinter.optical_flow import RAFTNet
>>> checkpoint = "data/checkpoints/RAFT/models/raft-things.pth"
>>> # Make sure the checkpoint exists and uncomment the following line
>>> # net = RAFTNet(checkpoint)
Parameters:
  • path (str) – Path to the RAFT model.

  • device ({'cpu', 'cuda'}) – Device on which to load the optical flow model.

class RAFT(args)

Bases: Module

forward(image1, image2, iters=12, flow_init=None, upsample=True, test_mode=False)

Estimate optical flow between a pair of frames.

freeze_bn()

initialize_flow(img)

Flow is represented as the difference between two coordinate grids: flow = coords1 - coords0.

training: bool

upsample_flow(flow, mask)

Upsample the flow field [H/8, W/8, 2] -> [H, W, 2] using convex combination.

static initialize_namespace()[source]

Initialize namespace needed for RAFT initialization.

Returns:

args – Arguments needed to instantiate the RAFT model.

Return type:

argparse.Namespace
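
This is mostly internal plumbing; a sketch of direct use (accessing the vendored RAFT class through RAFTNet.RAFT is an assumption based on the nested class documented above):

>>> # args = RAFTNet.initialize_namespace()
>>> # raft_model = RAFTNet.RAFT(args)  # assumed attribute access, see above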

predict_flow(img1, img2)[source]

Compute optical flow between two images.

Parameters:
  • img1 (np.ndarray) – The left image.

  • img2 (np.ndarray) – The right image.

Returns:

flow – The optical flow.

Return type:

np.ndarray
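
A sketch continuing the class example above; running the images through preprocess_images first is an assumption based on the method documented below:

>>> # img1, img2 = net.preprocess_images(img1=img1, img2=img2)
>>> # flow = net.predict_flow(img1=img1, img2=img2)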

preprocess_images(img1, img2)[source]

Preprocess images.

Parameters:
  • img1 (np.ndarray) – The left image.

  • img2 (np.ndarray) – The right image.

Returns:

  • img1 (np.ndarray) – The pre-processed left image.

  • img2 (np.ndarray) – The pre-processed right image.