<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Week 5 HW: Protein Design Part II :: 2026a-iman-karibzhanova</title><link>https://pages.htgaa.org/2026a/iman-karibzhanova/homework/week-05-hw-protein-design-part-ii/index.html</link><description>Part A: SOD1 Binder Peptide Design (From Pranam)
Part 1: Generate Binders with PepMLM

&gt;sp|P00441|SODC_HUMAN Superoxide dismutase [Cu-Zn] OS=Homo sapiens OX=9606 GN=SOD1 PE=1 SV=2
MATKAVCVLKGDGPVQGIINFEQKESNGPVKVWGSIKGLTEGLHGFHVHEFGDNTAGCTS
AGPHFNPLSRKHGGPKDEERHVGDLGNVTADKDGVADVSIEDSVISLSGDHCIIGRTLVV
HEKADDLGKGGNEESTKTGNAGSRLACGVIGIAQ
SOD1 A4V mutation:
MATKVVCVLKGDGPVQGIINFEQKESNGPVKVWGSIKGLTEGLHGFHVHEFGDNTAGCTS
AGPHFNPLSRKHGGPKDEERHVGDLGNVTADKDGVADVSIEDSVISLSGDHCIIGRTLVV
HEKADDLGKGGNEESTKTGNAGSRLACGVIGIAQ
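PepMLM's fill-from-most-confident strategy can be sketched as a minimal, self-contained Python loop. This is a toy illustration, not PepMLM's actual API: the `toy_score` helper stands in for real model logits, and the truncated SOD1 fragment is only for demonstration.

```python
import math

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def toy_score(context: str, pos: int, aa: str) -> float:
    """Deterministic stand-in for a language model's per-residue confidence.
    A real implementation would query masked-token probabilities instead."""
    return ((ord(aa) * (pos + 7) + len(context)) % 97) / 97.0

def fill_masked(target: str, peptide_len: int):
    """Start with a fully masked peptide appended to the target protein,
    then fill masked positions one at a time, most confident first.
    Returns the peptide and a pseudo-perplexity (lower = more confident)."""
    peptide = ["_"] * peptide_len
    log_probs = []
    while "_" in peptide:
        context = target + "".join(peptide)
        best = None  # (score, position, amino acid)
        for pos, slot in enumerate(peptide):
            if slot != "_":
                continue
            for aa in AMINO_ACIDS:
                s = toy_score(context, pos, aa)
                if best is None or s > best[0]:
                    best = (s, pos, aa)
        score, pos, aa = best
        peptide[pos] = aa  # commit the most confident prediction
        log_probs.append(math.log(max(score, 1e-9)))
    # pseudo-perplexity: exp of the negative mean log-probability
    ppl = math.exp(-sum(log_probs) / len(log_probs))
    return "".join(peptide), ppl

sod1_fragment = "MATKAVCVLKGDGPVQG"  # truncated SOD1 N-terminus, for illustration
binder, ppl = fill_masked(sod1_fragment, peptide_len=8)
print(binder, round(ppl, 3))
```

The loop mirrors the iterative decoding described in the notes: every remaining masked slot is scored, the single highest-confidence (position, residue) pair is committed, and the process repeats until no masks remain.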
Notes: Translation begins with methionine in eukaryotes, so the mutated position in the full translated sequence is technically 5 (A -&gt; V). Protein language models are trained on amino acid sequences. Masked language modelling randomly masks some positions in a sequence and trains the model to fill in those masked positions from context. PepMLM applies the same idea to peptide binder design: it masks the entire peptide appended to the target protein, then iteratively fills in the masked positions from most to least confident. Low perplexity = model is confident; high perplexity = model is uncertain.</description><generator>Hugo</generator><language>en</language><atom:link href="https://pages.htgaa.org/2026a/iman-karibzhanova/homework/week-05-hw-protein-design-part-ii/index.xml" rel="self" type="application/rss+xml"/></channel></rss>