Homework

Weekly homework submissions:

  • Week 1 HW: Principles and Practices

    WEEKLY ASSIGNMENT

    Ethical concerns that were especially new to me:

    • "Halfpipe of Doom": dual-use is not an edge case; it is the default. Many tools in synthetic biology are structurally dual-use: the same capability that enables public good (vaccines, diagnostics, remediation) can also enable harm (accidents, misuse, weaponization). This shifts ethics from "don't do bad things" to "assume capabilities will be repurposed."

    • Harm can be created while trying to prevent harm. The pandemic example (engineering SARS-CoV-2) made this concrete: building countermeasure capability can also expand the risk surface (accidental release, intentional misuse, normalization of high-risk methods). The ethical concern is not only intent, but second-order effects and indirect harms.

    • Responsibility is diffuse across an ecosystem, not located in one lab. The governance framework (goals × actors × actions) highlighted a new point: risk is not just about a "bad actor." It is distributed across supply chains and institutions (gene synthesis firms, oligo manufacturers, synthesizer makers, end users). The ethical questions become: Who has leverage to prevent incidents? Who bears the cost? Who is accountable when the system fails?

    • Trust is a technical constraint, not a PR layer. The Asilomar discussion reframed "public acceptance" as part of the engineering problem. Even a scientifically sound technology can fail socially if institutions are not trusted, if governance is opaque, or if the public feels excluded from deciding which futures are preferred.

    • Biotech is becoming economic and geopolitical infrastructure. The progression bio-apps → bio-economies → bio-power made it clear that biotech is not just science; it is tied to national strategies, competition, and incentives. That raises ethical risks around profit-driven priorities, uneven access, and "who benefits" vs. "who is exposed."

    Governance actions that seem appropriate: