DeepPrime is a deep-learning-based prime editing efficiency prediction tool developed in the Laboratory of Genome Editing, Yonsei University.
It greatly expands upon the previous state-of-the-art pegRNA activity prediction model, DeepPE, which was limited to a specific set of edit-type and edit-length combinations.
DeepPrime was trained on 259K pegRNAs covering primer binding site (PBS) lengths of 1 to 17 bp, reverse transcription template (RTT) lengths of 1 to 50 bp, edit positions from +1 to +30, and edit lengths of 1 to 3 nt.
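A quick way to sanity-check a pegRNA design against these training ranges is a small validation helper. The sketch below is purely illustrative, assuming the parameter names shown; it is not part of the DeepPrime API.

```python
# Hypothetical sketch: check whether pegRNA design parameters fall inside
# the ranges covered by DeepPrime's training data. Names are illustrative,
# not part of the actual DeepPrime package.

TRAINED_RANGES = {
    "pbs_len": (1, 17),   # primer binding site length (bp)
    "rtt_len": (1, 50),   # reverse transcription template length (bp)
    "edit_pos": (1, 30),  # edit position (+1 to +30)
    "edit_len": (1, 3),   # edit length (nt)
}

def within_trained_ranges(pbs_len, rtt_len, edit_pos, edit_len):
    """Return True if every parameter lies inside the training ranges."""
    params = {"pbs_len": pbs_len, "rtt_len": rtt_len,
              "edit_pos": edit_pos, "edit_len": edit_len}
    return all(lo <= params[key] <= hi
               for key, (lo, hi) in TRAINED_RANGES.items())

print(within_trained_ranges(13, 20, 5, 1))   # typical design -> True
print(within_trained_ranges(13, 60, 5, 1))   # RTT too long -> False
```

Designs outside these ranges were not seen during training, so predictions for them should be treated with caution.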
Specific rights, obligations, and restrictions apply to each Academic License.
For commercial use of our computational models, please contact Hyosun Lee (hyosunli@yuhs.ac) or Prof. Hyongbum (Henry) Kim (hkim1@yuhs.ac).