Week 1 HW: Principles and Practices


Image courtesy of Vincent Muir

Q1. Describe a biological engineering application or tool you want to develop and why:

Concept: Bio-Circuit for CO₂ Sensing and Reduction

An idea I have is creating a biological circuit designed to sense CO₂ emissions (or a chemical indicative of excess CO₂ production/presence). The higher goal would be to design a self-sustaining biological system capable of reducing CO₂ emissions with applications for enhancing green climate technology. With modern biosynthesis tools, I envision being able to modify a signal cascade pathway to trigger a fluorescent response for detection, then modify a protein like Rubisco to engineer greater carbon fixation as a potential method for emission reduction.

Goal: Biosafety and Biocontainment

The biggest concern related to applications of this idea would be biosafety. When working with biological systems, it is important to ensure that biological agents are contained, especially considering this is a living system. The release of synthetic systems into the environment could prove detrimental given the potential for uncontrolled growth and competition with natural systems. It is crucial that my system can “self-prune” or regulate itself to control excess growth of this nature.

Q3. Describe at least three different potential governance “actions.”

  1. Mandatory DNA Synthesis Screening (Option 1): I propose modifying the actions of commercial DNA synthesis companies and academic researchers. To my knowledge, DNA synthesis screening is typically voluntary and/or limited to a small set of known target pathogens. I propose enforcing mandatory screening of all synthetic DNA orders against a broad, consistently updated database of functional genomic markers, regardless of whether they belong to a known pathogen. This would hopefully lead to the characterization of more sequences responsible for hazardous biological activity. Implementation would likely require a standard technical protocol that all synthesis providers must use. Success would require major industry players to opt in; otherwise, customers could "offshore" orders to unregulated providers to avoid the associated costs. This assumes current screening algorithms can accurately distinguish benign research orders from flagged motif sequences without a high rate of false positives.

  2. Incentivizing Responsible Research via Insurance (Option 2): I propose a model that incentivizes responsible research by changing how actors (private insurance companies, research institutions, etc.) behave at a systemic level. Biorisk management is often seen as a bureaucratic cost. Under this model, institutions would receive lower insurance premiums upon demonstrating high-quality biorisk oversight. This could include frequent independent audits, mandatory training, and transparency in quality assessments. This is only executable assuming that private insurers have the technical expertise to judge scientific risk, and that the financial savings from lower premiums are a large enough incentive to reshape institutional behavior.

  3. Mandatory Transparency in Research Publications (Option 3): Research papers typically do not acknowledge the potential for misuse of new findings. This proposal would create a rule mandating that no federal funding be awarded, nor publications in major journals accepted, without a clearly detailed section outlining potential risks and mitigation strategies. Regulators would need to create a standardized template to ensure quality and compliance, and editorial staff would be expanded to include biosecurity reviewers who evaluate these statements before publication. This assumes that scientists can envision potential misuse cases of their own work, a task growing in difficulty given AI-assisted ideation.

Q4. Governance Scoring Matrix

| Does the option: | Option 1 (Screening) | Option 2 (Insurance) | Option 3 (Transparency) |
| --- | --- | --- | --- |
| **Enhance biosecurity** | | | |
| • By preventing incidents | High. Acts as a physical gatekeeper preventing the creation of hazardous sequences. | Medium. Encourages a safety culture but doesn't physically stop bad actors. | Low. Relies on post-hoc review; good for awareness but doesn't prevent creation. |
| • By helping respond | High. Creates a digital paper trail of who ordered which sequence. | Medium. Audit trails help with liability but not with immediate biological response. | Medium. Ensures mitigation strategies are thought out and published in advance. |
| **Foster lab safety** | | | |
| • By preventing incidents | Low. Focuses on the "what" (DNA), not the "how" (handling). | High. Directly mandates training and oversight of daily lab practices. | Low. Administrative in nature. |
| • By helping respond | Low. Not relevant to immediate lab accidents. | High. Insurance protocols would mandate accident reporting/response plans. | Low. |
| **Protect the environment** | | | |
| • By preventing incidents | High. Prevents synthesis of invasive/modified traits before release. | Medium. Better oversight leads to better containment protocols. | Low. |
| • By helping respond | Medium. Database allows rapid identification of escaped synthetic organisms. | Medium. Funding available for cleanup/remediation via insurance. | Medium. Publication strategies may include kill-switch documentation. |
| **Other considerations** | | | |
| • Minimizing costs/burdens | Low. High technical and administrative burden on providers. | Low. High upfront cost for institutions to reorganize compliance. | Medium. Adds writing/review time, but low financial cost. |
| • Feasibility? | Medium. Technology exists, but requires international buy-in. | Low. Market forces may not support this without regulation. | High. Journals/grant agencies can easily add this requirement. |
| • Not impeding research | Low. False positives could delay legitimate experiments. | Medium. Could create cost barriers for small labs/startups. | Medium. Scientists may self-censor or fear "hazard" labeling. |
| • Promoting constructive applications | High. Builds trust that the foundation of the bio-economy is safe. | High. Professionalizes the industry. | Medium. Increases public trust through transparency. |

Q5. Drawing upon this scoring, describe which governance option, or combination of options, you would prioritize, and why.

In order to combat an evolving landscape trending towards increased biological threats, I recommend that national regulatory bodies (e.g., the NIH) prioritize an integrated governance strategy that mandates DNA Synthesis Screening (Option 1) as a gatekeeper, paired with insurance-driven Responsible Research Oversight (Option 2) embedded in the institutional budget cycle. This multi-tiered approach assumes that screening keeps pace with AI-driven pathogen design, and that international industry cooperation prevents a financial "race to the bottom" in safety standards. The primary trade-off is added administrative burden on researchers, potentially delaying legitimate innovation; however, the combined approach minimizes the risk of accidental (or deliberate) release by creating a physical barrier at the point of sequence production and a procedural barrier through rigorous researcher training and compliance efforts.

Weekly Reflection

Reflecting on week one of How to Grow Almost Anything 2026, the core ethical challenge that stood out to me centered on the "dual-use dilemma." This makes logical sense; as more advanced biosynthetic tools become available at decentralized nodes, our ability to control how the technology is used diminishes. It is also concerning that AI-driven design combined with synthetic biology could produce novel toxins that existing screening databases would not recognize. I would propose a strategy that assumes "safety-by-design" can mitigate risks without stifling the creative freedom central to the HTGAA mission. This would ensure that labs with access to the technology are required to follow compliance protocols that prevent widespread abuse for nefarious purposes. I believe it is reasonable to accept an increased administrative load in exchange for maintaining safe access to groundbreaking technology.


Homework Questions

Professor Jacobson: Polymerase & Coding

1. Nature’s machinery for copying DNA is called polymerase. What is the error rate of polymerase? How does this compare to the length of the human genome? How does biology deal with that discrepancy?

  • Error Rate: Polymerases have a natural error rate of 10^-5 (1 in 100,000 wrong bases).
  • Comparison: When compared to the length of the human genome, which is approximately 3 x 10^9 (3 billion) base pairs long, this would result in 30,000+ incorrectly copied base pairs per division, which would be detrimental.
  • Correction Mechanism: Biology deals with this discrepancy through proofreading, in the form of 3’ to 5’ exonuclease activity: the enzyme stalls whenever an incorrect base is incorporated, excises it, and resumes. Additionally, mismatch repair complexes scan newly replicated DNA for remaining errors, bringing the overall error rate down to approximately 10^-9.
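The arithmetic behind these figures can be sketched in a few lines (the rates and genome length are the approximate values quoted above, not measured constants):

```python
# Back-of-the-envelope check of replication fidelity.
# Assumed round numbers: raw polymerase error rate ~1e-5,
# post-repair rate ~1e-9, human genome ~3e9 bp.
RAW_ERROR_RATE = 1e-5        # errors per base, polymerase alone
REPAIRED_ERROR_RATE = 1e-9   # after proofreading + mismatch repair
GENOME_LENGTH = 3e9          # base pairs in the human genome

raw_errors = RAW_ERROR_RATE * GENOME_LENGTH            # ~30,000 per copy
repaired_errors = REPAIRED_ERROR_RATE * GENOME_LENGTH  # ~3 per copy

print(f"Without repair: ~{raw_errors:,.0f} errors per genome copy")
print(f"With repair:    ~{repaired_errors:,.0f} errors per genome copy")
```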

2. How many different ways are there to code (DNA nucleotide code) for an average human protein? In practice what are some of the reasons that all of these different codes don’t work to code for the protein of interest?

  • Coding Possibilities: The genetic code is innately degenerate, which buffers against loss of function when a mutation occurs. Individual amino acids are encoded by as many as 6 synonymous codons. Assuming an average of 3 codons per amino acid and roughly 400 amino acids per average human protein, there are on the order of 3^400 distinct DNA sequences that could code for a single protein.
  • Practical Limitations: In practice, not all of these sequences will code for the protein of interest efficiently. This is partially due to codon usage bias, where the translational machinery prefers specific codons based on tRNA availability. Additionally, not all DNA ends up coding for protein; splicing of the pre-mRNA after transcription changes which sequence is retained and, as a result, which protein is actually synthesized.
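A quick sketch of the 3^400 estimate, using the same assumed averages (≈3 synonymous codons per residue, ≈400 residues per protein; both are rough figures, since real degeneracy ranges from 1 to 6):

```python
import math

AVG_CODONS_PER_AA = 3   # assumed average degeneracy per amino acid
PROTEIN_LENGTH = 400    # assumed average human protein length (residues)

# Exact big-integer count of synonymous DNA encodings under these averages.
n_encodings = AVG_CODONS_PER_AA ** PROTEIN_LENGTH

# Express the magnitude as a power of ten: 400 * log10(3) ~= 190.8.
print(f"~10^{math.log10(n_encodings):.0f} distinct DNA sequences")  # ~10^191
```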

Dr. LeProust: DNA Synthesis

3. What’s the most commonly used method for oligo synthesis currently? Currently, solid-phase phosphoramidite chemistry is the most commonly used method for oligo synthesis. A computer controls the chemical workflow, adding one base at a time to build DNA chains in the 3’ to 5’ direction (opposite to biological synthesis).

4. Why is it difficult to make oligos longer than 200nt via direct synthesis? Because each chemical coupling step is imperfect, even at a 99.5% per-step efficiency, only 0.995^200 ≈ 37% of 200nt oligos are full-length and correct. Nearly two-thirds of the synthesized strands would be truncated or erroneous sequences.

5. Why can’t you make a 2000bp gene via direct oligo synthesis? Using the same math, 0.995^2000 ≈ 0.0044%: only about 1 in 23,000 synthesized strands would be the correct full-length sequence, making it impractical to isolate the correct product from the mixture of failures.
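The yield math from both answers, as a short sketch (the 99.5% per-step coupling efficiency is the assumed figure from above, not a property of any particular instrument):

```python
COUPLING_EFFICIENCY = 0.995  # assumed per-base coupling success rate

def full_length_yield(n_bases: int, eff: float = COUPLING_EFFICIENCY) -> float:
    """Fraction of strands for which every coupling step succeeded."""
    return eff ** n_bases

# Yield collapses exponentially with length: ~37% at 200 nt,
# but only a few in 100,000 at 2000 nt.
for n in (200, 2000):
    y = full_length_yield(n)
    print(f"{n:>5} nt: {y:.3%} full-length (~1 in {round(1 / y):,})")
```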

George Church: The Lysine Contingency

6. What are the 10 essential amino acids in all animals, and how does this affect your view of the “Lysine Contingency”?

  • Essential Amino Acids: The 10 essential amino acids cannot be synthesized by animal cellular machinery and must be obtained through diet. They are: Arginine, Histidine, Isoleucine, Leucine, Lysine, Methionine, Phenylalanine, Threonine, Tryptophan, and Valine.
  • The Lysine Contingency: The “Lysine Contingency” from Jurassic Park is the theoretical fail-safe engineered to prevent dinosaurs from surviving in the wild. However, since lysine is already an essential amino acid, the dinosaurs would naturally need to eat lysine in their diet to survive regardless of genetic engineering. Therefore, removing their ability to produce lysine has the same effect as a standard starvation diet, serving as no additional protective measure.

Gemini AI was consulted for formatting