Note
Go to the end to download the full example code.
3D Image Orientation¶
This script illustrates the estimation of orientation angles using a synchronization matrix and the voting method, based on simulated data projected from a 3D cryo-EM map.
import os
import numpy as np
from aspire.abinitio import CLSyncVoting
from aspire.operators import RadialCTFFilter
from aspire.source import OrientedSource, Simulation
from aspire.utils import mean_aligned_angular_distance
from aspire.volume import Volume
file_path = os.path.join(
    os.path.dirname(os.getcwd()), "data", "clean70SRibosome_vol_65p.mrc"
)
print(
    "This script illustrates orientation estimation using a "
    "synchronization matrix and the voting method."
)
This script illustrates orientation estimation using a synchronization matrix and the voting method.
Initialize Simulation Object and CTF Filters¶
# Define a precision for this experiment
dtype = np.float32
# Set the size of the images
img_size = 33
# Set the total number of images generated from the 3D map
num_imgs = 128
# Specify the CTF parameters. They are not used in this example,
# but they are necessary for initializing the Simulation object.
voltage = 200 # Voltage (in kV)
defocus_min = 1.5e4 # Minimum defocus value (in angstroms)
defocus_max = 2.5e4 # Maximum defocus value (in angstroms)
defocus_ct = 7 # Number of defocus groups.
Cs = 2.0 # Spherical aberration
alpha = 0.1 # Amplitude contrast
print("Initialize simulation object and CTF filters.")
# Create CTF filters
filters = [
    RadialCTFFilter(voltage, defocus=d, Cs=Cs, alpha=alpha)
    for d in np.linspace(defocus_min, defocus_max, defocus_ct)
]
Initialize simulation object and CTF filters.
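For reference, the transfer function that these filters represent can be written down directly. The sketch below evaluates a radial CTF in plain NumPy under one common sign convention; it is an illustrative approximation, not ASPIRE's RadialCTFFilter implementation, which may differ in details such as parameter units and B-factor handling.

def radial_ctf(k, voltage_kv=200, defocus=1.5e4, cs_mm=2.0, alpha=0.1):
    """Evaluate a radial CTF at spatial frequency k (in 1/Angstrom). Sketch only."""
    v = voltage_kv * 1e3
    lam = 12.2639 / np.sqrt(v + 0.97845e-6 * v**2)  # electron wavelength (Angstrom)
    cs = cs_mm * 1e7  # spherical aberration, mm -> Angstrom
    chi = np.pi * lam * defocus * k**2 - 0.5 * np.pi * cs * lam**3 * k**4
    return -np.sin(chi + np.arcsin(alpha))  # amplitude contrast as an extra phase shift

# Example: sample the CTF up to the Nyquist frequency of a 5 Angstrom pixel size.
ctf_curve = radial_ctf(np.linspace(0, 0.1, 200), defocus=defocus_min)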
Downsampling¶
# Load the map file of a 70S ribosome and downsample the 3D map to the desired resolution.
# The downsampling can be done with the Volume object's ``downsample`` method.
print(
    "Load the 3D map and downsample it to the desired grid "
    f"of {img_size} x {img_size} x {img_size}."
)
vols = Volume.load(file_path, dtype=dtype)
vols = vols.downsample(img_size)
Load the 3D map and downsample it to the desired grid of 33 x 33 x 33.
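Conceptually, this kind of downsampling can be carried out by cropping the centered Fourier transform of the volume. The NumPy sketch below illustrates the idea for a cubic volume; it is only an analogy and may not match Volume.downsample exactly (for example in normalization or the handling of even and odd sizes).

def downsample_fourier(vol, out_size):
    # Downsample a cubic L x L x L array by keeping only the central
    # out_size x out_size x out_size block of its shifted Fourier transform.
    L = vol.shape[0]
    start = L // 2 - out_size // 2
    sl = slice(start, start + out_size)
    f = np.fft.fftshift(np.fft.fftn(vol))
    out = np.fft.ifftn(np.fft.ifftshift(f[sl, sl, sl])).real
    return out * (out_size / L) ** 3  # rescale so mean intensity is preserved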
Create Simulation Object and Obtain True Rotation Angles¶
# Create a simulation object with specified filters and the downsampled 3D map
print("Use downsampled map to creat simulation object.")
sim = Simulation(
    L=img_size, n=num_imgs, vols=vols, unique_filters=filters, pixel_size=5, dtype=dtype
)
print("Get true rotation angles generated randomly by the simulation object.")
rots_true = sim.rotations
Use downsampled map to create simulation object.
Get true rotation angles generated randomly by the simulation object.
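As a quick sanity check (not part of the original example), one can verify that the returned array has the expected shape and that each entry is a proper rotation matrix:

assert rots_true.shape == (num_imgs, 3, 3)
# Each R should satisfy R @ R.T = I and det(R) = +1 (within float32 tolerance).
assert np.allclose(rots_true @ np.transpose(rots_true, (0, 2, 1)), np.eye(3), atol=1e-5)
assert np.allclose(np.linalg.det(rots_true), 1.0, atol=1e-5)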
Estimate Orientation¶
# Initialize an orientation estimation object and create an
# ``OrientedSource`` object to perform viewing angle and image offset
# estimation. Here, because of the small image size of the
# ``Simulation``, we customize the ``CLSyncVoting`` method to use
# fewer theta values when searching for common-lines between pairs of
# images. Additionally, since we are processing images with no noise,
# we opt not to use a ``fuzzy_mask``, an option that improves
# common-line detection in higher noise regimes.
print(
    "Estimate rotation angles and offsets using a synchronization matrix and the voting method."
)
orient_est = CLSyncVoting(sim, n_theta=36, mask=False)
oriented_src = OrientedSource(sim, orient_est)
rots_est = oriented_src.rotations
Estimate rotation angles and offsets using a synchronization matrix and the voting method.
Searching over common line pairs: 100%|██████████| 8128/8128 [00:01<00:00, 5842.93it/s]
LSQR Least-squares solution of Ax = b
The matrix A has 8128 rows and 256 columns
damp = 0.00000000000000e+00 calc_var = 0
atol = 1.00e-06 conlim = 1.00e+08
btol = 1.00e-06 iter_lim = 512
Itn x[0] r1norm r2norm Compatible LS Norm A Cond A
0 0.00000e+00 2.324e+02 2.324e+02 1.0e+00 2.2e-02
1 1.00500e+00 1.660e+02 1.660e+02 7.1e-01 2.1e-01 7.4e+00 1.0e+00
2 8.32686e-01 1.618e+02 1.618e+02 7.0e-01 4.6e-02 1.0e+01 2.1e+00
3 8.59398e-01 1.613e+02 1.613e+02 6.9e-01 1.4e-02 1.2e+01 3.2e+00
4 9.11518e-01 1.613e+02 1.613e+02 6.9e-01 5.8e-03 1.4e+01 4.4e+00
5 8.95731e-01 1.613e+02 1.613e+02 6.9e-01 2.3e-03 1.5e+01 5.6e+00
6 8.97382e-01 1.613e+02 1.613e+02 6.9e-01 9.9e-04 1.7e+01 6.8e+00
7 8.94089e-01 1.613e+02 1.613e+02 6.9e-01 4.8e-04 1.8e+01 8.2e+00
8 8.91862e-01 1.613e+02 1.613e+02 6.9e-01 2.0e-04 1.9e+01 9.5e+00
9 8.92473e-01 1.613e+02 1.613e+02 6.9e-01 8.7e-05 2.0e+01 1.1e+01
10 8.92054e-01 1.613e+02 1.613e+02 6.9e-01 3.9e-05 2.1e+01 1.2e+01
12 8.91986e-01 1.613e+02 1.613e+02 6.9e-01 8.5e-06 2.3e+01 1.4e+01
13 8.91890e-01 1.613e+02 1.613e+02 6.9e-01 2.8e-06 2.4e+01 1.6e+01
14 8.91890e-01 1.613e+02 1.613e+02 6.9e-01 8.1e-07 2.5e+01 1.7e+01
LSQR finished
The least-squares solution is good enough, given atol
istop = 2 r1norm = 1.6e+02 anorm = 2.5e+01 arnorm = 3.3e-03
itn = 14 r2norm = 1.6e+02 acond = 1.7e+01 xnorm = 2.4e+01
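The synchronization step rests on the fact that a block matrix of relative rotations has rank 3. The toy check below makes that structure visible by building such a matrix from the true rotations; the actual algorithm estimates related blocks from the detected common lines rather than from ground truth, so this is an illustration of the principle, not of ASPIRE's internal construction.

N = len(rots_true)
# Stack the rotations into a (3N, 3) matrix G; then S = G @ G.T has
# 3x3 blocks S_ij = R_i @ R_j.T and rank exactly 3.
G = rots_true.reshape(3 * N, 3)
S = G @ G.T
# Since G.T @ G = N * I, the only nonzero eigenvalues of S are N (multiplicity 3).
top_eigs = np.sort(np.linalg.eigvalsh(S))[::-1][:4]
print(top_eigs)  # approximately [N, N, N, 0]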
Mean Angular Distance¶
# ``mean_aligned_angular_distance`` will perform global alignment of the estimated rotations
# to the ground truth and find the mean angular distance between them (in degrees).
mean_ang_dist = mean_aligned_angular_distance(rots_est, rots_true)
print(
    f"Mean angular distance between estimates and ground truth: {mean_ang_dist} degrees"
)
# Basic Check
assert mean_ang_dist < 10
Mean angular distance between estimates and ground truth: 8.623021125793457 degrees
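The underlying metric is the geodesic angle between two rotations, arccos((trace(R1 @ R2.T) - 1) / 2). The sketch below computes this per-image quantity directly; unlike ``mean_aligned_angular_distance``, it does not first remove the global rotation relating the two sets, so its mean can differ from the aligned value reported above.

def angular_distance_deg(R1, R2):
    # Geodesic angle between two rotation matrices, in degrees.
    cos_theta = (np.trace(R1 @ R2.T) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Per-image distances without the global alignment step.
raw_dists = np.array(
    [angular_distance_deg(r_e, r_t) for r_e, r_t in zip(rots_est, rots_true)]
)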
Offsets Estimation¶
# The ground truth offsets from the simulation can be compared to
# those estimated by the common-lines method above.
# Calculate Estimation error in pixels for each image.
offs_diff = np.sqrt(np.sum((oriented_src.offsets - sim.offsets) ** 2, axis=1))
# Calculate the mean error in pixels across all images.
offs_err = offs_diff.mean()
print(f"Mean offset error in pixels {offs_err}, approx {offs_err/img_size*100:.1f}%")
Mean offset error in pixels 0.732833800415208, approx 2.2%
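Beyond the mean, a quick look at the spread of the per-image errors can be informative. The summary below is an illustrative addition, not part of the original example:

print(
    f"Offset error median: {np.median(offs_diff):.3f} px, "
    f"95th percentile: {np.percentile(offs_diff, 95):.3f} px"
)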
Total running time of the script: (0 minutes 4.199 seconds)