Converted from exercises.ipynb for web reading.

Einstein Summation and Index Notation - Exercises

This notebook contains 10 progressive exercises for 05-Einstein-Summation-and-Index-Notation. Each exercise has a learner workspace followed by a complete reference solution. The goal is fluency in the foundational math used by the later linear algebra, calculus, probability, and ML sections.
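As a warm-up before the exercises, the three rules of einsum index notation can be sketched in a few lines (plain NumPy, nothing exercise-specific assumed):

```python
import numpy as np

a = np.array([1., 2., 3.])
A = np.arange(6.).reshape(2, 3)

# Rule 1: an index repeated across inputs is summed over (contracted).
print(np.einsum('i,i->', a, a))    # sum_i a_i a_i, the dot product

# Rule 2: indices listed after '->' survive; their order sets the output axes.
print(np.einsum('ij->ji', A))      # transpose: no summation at all

# Rule 3: omitting '->' sums every repeated index and orders the remaining
#         indices alphabetically (implicit mode).
print(np.einsum('ij,j', A, a))     # same as np.einsum('ij,j->i', A, a)
```

Keeping the explicit `->` form, as the solutions below do, avoids surprises from implicit-mode index ordering.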

Code cell 2

import numpy as np
import matplotlib.pyplot as plt
import matplotlib as mpl

try:
    import seaborn as sns
    sns.set_theme(style="whitegrid", palette="colorblind")
    HAS_SNS = True
except ImportError:
    plt.style.use("seaborn-v0_8-whitegrid")
    HAS_SNS = False

mpl.rcParams.update({
    "figure.figsize":    (10, 6),
    "figure.dpi":         120,
    "font.size":           13,
    "axes.titlesize":      15,
    "axes.labelsize":      13,
    "xtick.labelsize":     11,
    "ytick.labelsize":     11,
    "legend.fontsize":     11,
    "legend.framealpha":   0.85,
    "lines.linewidth":      2.0,
    "axes.spines.top":     False,
    "axes.spines.right":   False,
    "savefig.bbox":       "tight",
    "savefig.dpi":         150,
})
np.random.seed(42)
print("Plot setup complete.")

Code cell 3

import numpy as np
import numpy.linalg as la
from decimal import Decimal, getcontext
from itertools import product

np.set_printoptions(precision=8, suppress=True)
np.random.seed(42)

def header(title):
    print("\n" + "=" * len(title))
    print(title)
    print("=" * len(title))

def check_true(name, cond):
    ok = bool(cond)
    print(f"{'PASS' if ok else 'FAIL'} - {name}")
    return ok

def check_close(name, got, expected, tol=1e-8):
    ok = np.allclose(got, expected, atol=tol, rtol=tol)
    print(f"{'PASS' if ok else 'FAIL'} - {name}")
    if not ok:
        print("  got     =", got)
        print("  expected=", expected)
    return ok

def powerset(s):
    items=list(s)
    return [set(items[i] for i in range(len(items)) if mask & (1 << i)) for mask in range(1 << len(items))]

print("Chapter 01 helper setup complete.")

Exercise 1: Dot Product

Use einsum to compute the dot product $a_i b_i$.

Code cell 5

# Your Solution
# Exercise 1 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 1.")

Code cell 6

# Solution
# Exercise 1 - Dot Product
header("Exercise 1: dot")
a=np.array([1.,2.,3.]); b=np.array([4.,5.,6.])
check_close("dot", np.einsum('i,i->',a,b), a@b)

Exercise 2: Matrix Vector

Use einsum to compute the matrix-vector product $A_{ij} x_j$.

Code cell 8

# Your Solution
# Exercise 2 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 2.")

Code cell 9

# Solution
# Exercise 2 - Matrix Vector
header("Exercise 2: matvec")
A=np.arange(6.).reshape(2,3); x=np.array([1.,0.,-1.])
check_close("matvec", np.einsum('ij,j->i',A,x), A@x)

Exercise 3: Matrix Matrix

Use einsum to compute the matrix product $A_{ik} B_{kj}$.

Code cell 11

# Your Solution
# Exercise 3 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 3.")

Code cell 12

# Solution
# Exercise 3 - Matrix Matrix
header("Exercise 3: matmul")
A=np.arange(6.).reshape(2,3); B=np.arange(12.).reshape(3,4)
check_close("matmul", np.einsum('ik,kj->ij',A,B), A@B)

Exercise 4: Batch Matrix Multiply

Use einsum for batched products.

Code cell 14

# Your Solution
# Exercise 4 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 4.")

Code cell 15

# Solution
# Exercise 4 - Batch Matrix Multiply
header("Exercise 4: batch matmul")
A=np.arange(24.).reshape(2,3,4); B=np.arange(40.).reshape(2,4,5)
check_close("batch", np.einsum('bik,bkj->bij',A,B), np.matmul(A,B))
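The explicit batch index `b` in the solution above can also be written with an ellipsis, which stands for any number of leading batch axes; a minimal sketch reusing the same shapes:

```python
import numpy as np

A = np.arange(24.).reshape(2, 3, 4)
B = np.arange(40.).reshape(2, 4, 5)

# '...' absorbs the leading batch dimensions, so the same spec works
# unchanged for 3-D, 4-D, or higher-rank stacks of matrices.
Y = np.einsum('...ik,...kj->...ij', A, B)
print(Y.shape)   # (2, 3, 5)
```

This mirrors how `np.matmul` treats all leading axes as batch dimensions.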

Exercise 5: Trace

Compute the trace $A_{ii}$ with einsum.

Code cell 17

# Your Solution
# Exercise 5 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 5.")

Code cell 18

# Solution
# Exercise 5 - Trace
header("Exercise 5: trace")
A=np.arange(9.).reshape(3,3)
check_close("trace", np.einsum('ii->',A), np.trace(A))
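The repeated-index rule from the solution above extends directly to a batch of matrices, something `np.trace` needs an axis argument for; a small sketch:

```python
import numpy as np

A = np.arange(9.).reshape(3, 3)
batch = np.stack([A, 2 * A])   # shape (2, 3, 3)

# 'bii->b' sums the diagonal of each matrix in the batch.
traces = np.einsum('bii->b', batch)
print(traces)   # [12. 24.]
```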

Exercise 6: Attention Scores

Compute the scaled attention scores $S_{ij} = Q_{id} K_{jd} / \sqrt{d}$.

Code cell 20

# Your Solution
# Exercise 6 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 6.")

Code cell 21

# Solution
# Exercise 6 - Attention Scores
header("Exercise 6: attention scores")
Q=np.array([[1.,0.],[1.,1.]]); K=np.array([[1.,1.],[0.,1.],[1.,0.]])
S=np.einsum('id,jd->ij',Q,K)/np.sqrt(Q.shape[1])
check_close("scores", S, Q@K.T/np.sqrt(2))
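As a bridge to full attention, the scores above can be pushed through a row-softmax and applied to a value matrix; the matrix `V` below is invented purely for illustration:

```python
import numpy as np

Q = np.array([[1., 0.], [1., 1.]])
K = np.array([[1., 1.], [0., 1.], [1., 0.]])
V = np.arange(6.).reshape(3, 2)   # made-up value matrix for this sketch

# Scores: contract over the feature index d, scale by sqrt(d).
S = np.einsum('id,jd->ij', Q, K) / np.sqrt(Q.shape[1])

# Row-wise softmax turns scores into attention weights over the keys.
W = np.exp(S - S.max(axis=1, keepdims=True))
W /= W.sum(axis=1, keepdims=True)

# Output: each query receives a weighted average of the value vectors.
out = np.einsum('ij,jd->id', W, V)
print(out.shape)   # (2, 2)
```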

Exercise 7: Covariance

Compute the sample covariance $C_{ij} = X_{ni} X_{nj} / (n-1)$, where $X$ has first been centered by subtracting its column means.

Code cell 23

# Your Solution
# Exercise 7 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 7.")

Code cell 24

# Solution
# Exercise 7 - Covariance
header("Exercise 7: covariance")
X=np.array([[1.,2.],[2.,1.],[3.,4.]])
Xc=X-X.mean(axis=0)
C=np.einsum('ni,nj->ij',Xc,Xc)/(X.shape[0]-1)
check_close("covariance", C, Xc.T@Xc/(X.shape[0]-1))
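The same estimator can be cross-checked against NumPy's built-in `np.cov`, using `rowvar=False` so columns are treated as variables, matching the $n \times d$ layout above:

```python
import numpy as np

X = np.array([[1., 2.], [2., 1.], [3., 4.]])
Xc = X - X.mean(axis=0)
C = np.einsum('ni,nj->ij', Xc, Xc) / (X.shape[0] - 1)

# np.cov centers internally and divides by (n - 1) by default.
print(np.allclose(C, np.cov(X, rowvar=False)))   # True
```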

Exercise 8: Tensor Contraction

Contract a rank-3 tensor with a vector over one index.

Code cell 26

# Your Solution
# Exercise 8 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 8.")

Code cell 27

# Solution
# Exercise 8 - Tensor Contraction
header("Exercise 8: tensor contraction")
T=np.arange(24.).reshape(2,3,4); v=np.array([1.,0.,-1.])
Y=np.einsum('ijk,j->ik',T,v)
check_close("shape", np.array(Y.shape), np.array([2,4]))
check_close("manual", Y[0], T[0,0]-T[0,2])
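The same contraction can be cross-checked with `np.tensordot`, which names axis positions instead of index letters; a minimal sketch on the arrays from the solution:

```python
import numpy as np

T = np.arange(24.).reshape(2, 3, 4)
v = np.array([1., 0., -1.])

Y_einsum = np.einsum('ijk,j->ik', T, v)

# tensordot contracts axis 1 of T against axis 0 of v,
# which is exactly the repeated index j above.
Y_tensordot = np.tensordot(T, v, axes=([1], [0]))

print(np.allclose(Y_einsum, Y_tensordot))   # True
```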

Exercise 9: Diagonal Extraction

Use repeated index notation to extract a diagonal.

Code cell 29

# Your Solution
# Exercise 9 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 9.")

Code cell 30

# Solution
# Exercise 9 - Diagonal Extraction
header("Exercise 9: diagonal")
A=np.arange(16.).reshape(4,4)
d=np.einsum('ii->i',A)
check_close("diag", d, np.diag(A))
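One detail worth knowing (documented in the einsum reference for NumPy >= 1.10): `'ii->i'` returns a *view* of the diagonal, so writing through it modifies the original matrix in place.

```python
import numpy as np

A = np.arange(16.).reshape(4, 4)

# d shares memory with A: it is a view, not a copy.
d = np.einsum('ii->i', A)
print(np.shares_memory(d, A))   # True

d[:] = 0.   # zeroes the diagonal of A itself
```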

Exercise 10: Einsum Optimization

Compare a two-step contraction with a direct einsum.

Code cell 32

# Your Solution
# Exercise 10 - learner workspace
# Write your solution here, then run the reference solution below to compare.
print("Learner workspace ready for Exercise 10.")

Code cell 33

# Solution
# Exercise 10 - Einsum Optimization
header("Exercise 10: contraction path")
A=np.arange(6.).reshape(2,3); B=np.arange(12.).reshape(3,4); C=np.arange(20.).reshape(4,5)
Y=np.einsum('ik,kj,jl->il',A,B,C)
check_close("direct equals chained", Y, (A@B)@C)
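For larger operands, the order of pairwise contractions dominates the cost. `np.einsum_path` reports the order that `optimize=True` would choose; the operand shapes below are arbitrary and chosen only to make the cost difference visible:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 30))
B = rng.standard_normal((30, 40))
C = rng.standard_normal((40, 5))

# einsum_path returns the chosen contraction order plus a cost report.
path, report = np.einsum_path('ik,kj,jl->il', A, B, C, optimize='optimal')
print(report)

# Passing the precomputed path avoids re-planning on repeated calls.
Y = np.einsum('ik,kj,jl->il', A, B, C, optimize=path)
print(np.allclose(Y, (A @ B) @ C))   # True
```

Since `C` has only 5 columns, contracting `B` with `C` first shrinks the intermediate, which is the kind of reordering the optimizer looks for.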
