
Helpful PDB Commands

Posted on June 17, 2024 • Tags: python debugging

pdb is Python’s built-in interactive debugger. Here is a list of helpful pdb commands for debugging Python programs.

Automatically break at error

This is literally a cheat code for debugging Python programs.

If you run your Python program with -m pdb, pdb will automatically open a debugging session at the exact point where an uncaught exception occurs (post-mortem debugging). Note that pdb first pauses before the first line of your script; type c to start it running.

python3 -m pdb script.py
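
For example, given a toy script like the hypothetical divide.py below (not from this post), the uncaught ZeroDivisionError will drop you straight into a (Pdb) prompt at the line that raised it, where you can inspect a and b:

# divide.py (hypothetical example)
def divide(a: int, b: int) -> float:
    return a / b  # raises ZeroDivisionError when b == 0

if __name__ == "__main__":
    print(divide(1, 0))

Run it with python3 -m pdb divide.py, enter c at the first (Pdb) prompt to start execution, and once the exception propagates uncaught, pdb re-enters in post-mortem mode inside divide().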

PDB Commands

Within a pdb session…
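
The examples below assume you are already paused at a (Pdb) prompt. Besides -m pdb above, the usual way to get one (and how the breakpoints in the later transcripts were set) is to call the built-in breakpoint() where you want to stop. A minimal sketch, with hypothetical helper names (load_batch, model):

# somewhere in your code (illustrative only)
def main():
    batch = load_batch()      # hypothetical helper
    breakpoint()              # Python 3.7+; equivalent to: import pdb; pdb.set_trace()
    logits = model(**batch)   # pdb pauses here, just before this line runs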

Visibility

l: Show the source lines before/after the current line

Command: l (“list”)

Example:

> /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(202)main()
-> logits: Float[torch.Tensor, 'B L C'] = model(**batch)
(Pdb) l
198                         print(t)
199                     raise e
200  
201                 # Run model
202  ->             logits: Float[torch.Tensor, 'B L C'] = model(**batch)
203  
204                 # Compute CE loss
205                 binary_labels = torch.tensor(values, device=args.device).long()
206                 loss = criterion(logits, binary_labels)

w: Print the stack trace (the > arrow marks the current frame)

Command: w (“where”)

Example:

(Pdb) w
  /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(220)<module>()
-> main()
> /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(202)main()
-> logits: Float[torch.Tensor, 'B L C'] = model(**batch)

u: Move up one level in the stack trace (to the caller’s frame)

Command: u (“up”)

Example:

(Pdb) w
  /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(220)<module>()
-> main()
> /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(202)main()
-> logits: Float[torch.Tensor, 'B L C'] = model(**batch)

(Pdb) u
> /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(220)<module>()
-> main()

d: Move down one level in the stack trace (back toward the most recent frame)

Command: d (“down”)

Example:

(Pdb) w
> /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(220)<module>()
-> main()
  /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(202)main()
-> logits: Float[torch.Tensor, 'B L C'] = model(**batch)

(Pdb) d
> /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(202)main()
-> logits: Float[torch.Tensor, 'B L C'] = model(**batch)

Running Code

n: Execute the current line and stop at the next line in the current function (steps over function calls)

Command: n (“next”)

Example:

(Pdb) l
198                         print(t)
199                     raise e
200  
201                 # Run model
202  ->             logits: Float[torch.Tensor, 'B L C'] = model(**batch)
203  
204                 # Compute CE loss
205                 binary_labels = torch.tensor(values, device=args.device).long()
206                 loss = criterion(logits, binary_labels)
207  

(Pdb) n
> /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(205)main()
-> binary_labels = torch.tensor(values, device=args.device).long()

(Pdb) l
201                 # Run model
202                 logits: Float[torch.Tensor, 'B L C'] = model(**batch)
203  
204                 # Compute CE loss
205  ->             binary_labels = torch.tensor(values, device=args.device).long()
206                 loss = criterion(logits, binary_labels)
207  
208                 # Backward pass and optimization step
209                 optimizer.zero_grad()
210                 loss.backward()

c: Continue until next breakpoint

Command: c (“continue”)

Example:

(Pdb) l
198                         print(t)
199                     raise e
200  
201                 # Run model
202                 breakpoint()
203  ->             logits: Float[torch.Tensor, 'B C'] = model(**batch)
204  
205                 # Compute CE loss
206                 binary_labels: Float[torch.Tensor, 'B'] = torch.tensor(values, device=args.device).long()
207                 loss = criterion(logits, binary_labels)
208  

(Pdb) c
> /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/finetune.py(211)main()
-> optimizer.zero_grad()

(Pdb) l
206                 binary_labels: Float[torch.Tensor, 'B'] = torch.tensor(values, device=args.device).long()
207                 loss = criterion(logits, binary_labels)
208  
209                 # Backward pass and optimization step
210                 breakpoint()
211  ->             optimizer.zero_grad()
212                 loss.backward()
213                 optimizer.step()
214  
215             # Save the fine-tuned model
216             save_finetuned_model(model)

Breakpoints

b [LINE_NUM]: Add a breakpoint at line #LINE_NUM

Command: b [LINE_NUM] (“break”)

Example:

(Pdb) b 200
Breakpoint 4 at /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/ehrshot.py:200

(Pdb) break
Num Type         Disp Enb   Where
4   breakpoint   keep yes   at /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/ehrshot.py:200

break: Print out all active breakpoints

Command: break (with no arguments)

Example:

(Pdb) break
Num Type         Disp Enb   Where
1   breakpoint   keep yes   at /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/ehrshot.py:200

clear [B_NUM]: Delete breakpoint #B_NUM

Command: clear [B_NUM] (“clear”)

Example:

(Pdb) break
Num Type         Disp Enb   Where
5   breakpoint   keep yes   at /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/ehrshot.py:200
6   breakpoint   keep yes   at /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/ehrshot.py:201

(Pdb) clear 5
Deleted breakpoint 5 at /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/ehrshot.py:200

(Pdb) break
Num Type         Disp Enb   Where
6   breakpoint   keep yes   at /share/pi/nigam/mwornow/hf_ehr/hf_ehr/eval/ehrshot.py:201