Tuesday, October 25, 2011

Coping, just

 Too late to get upset over anything now, hang on

Monday, October 3, 2011

Site-Directed Mutagenesis Redux

My current research project involves introducing single-base mutations as well as in-frame deletions into a human gene and studying their effects on the protein product. My supervisor was kind enough to purchase a rather expensive commercial kit for the purpose.

So we went ahead with the kit. The positive controls worked brilliantly, and so did the competent cells that came with the kit. However, every actual experiment failed; not a single colony formed.

The next two months were spent in frustration and confusion. We tried many suggestions from various websites as well as from colleagues, none of which worked. The situation seemed especially bleak at one point when the kit ran out and we had to wait weeks for more material to arrive, doing nothing in the meantime since we did not have the right constructs to work with.

Jumping ahead to the present day, we were able to achieve the goal by doing it the hard way. Looking back at our past attempts, a lot of the "helpful hints" we found on the internet were wrong. Senior lab members at my university, while more experienced, did not necessarily understand the mechanism of mutagenesis either, so their advice was of little help (one technician simply suggested blasting through with large doses of every ingredient; he seems to have got lucky with that approach on every occasion). In addition, there were too many issues with our methodology and record-keeping, which led us into many dead ends, and our sterile technique was imperfect enough that we wasted much time and resource on several false positive results caused by contamination.

So what went wrong? Let's begin with a short summary of various historical and contemporary approaches to site-directed mutagenesis.


The first site-directed mutagenesis method, described by Hutchison et al., was published in 1978. In essence, the method involves synthesising a short oligonucleotide primer containing the mutated sequence, which is hybridised with the wild-type ssDNA and extended with DNA polymerase before transformation into E. coli.

The protocol did not eliminate the template, hence fewer than 20% of the progeny would retain the mutation, requiring tedious selection, a major pain back in the 1970s with DNA sequencing in its infancy and Kary Mullis a few trips away from inventing PCR. It is not surprising that some better-funded and less patient labs resorted to having the mutant gene completely synthesised at great expense or, in a less extreme version known as cassette mutagenesis, partially synthesised with suitable cloning sites and inserted into the vector by ligation.

Annotated diagram found on the internet

Fortunately for the less privileged, many improved protocols soon emerged to increase mutant yield and reduce the burden of selection. One of the most significant advancements was developed by Kunkel et al., in which... oh well, I'll be lazy for once and quote Wikipedia directly:

"The plasmid to be mutated is transformed into an E. coli strain deficient in two enzymes, dUTPase (dut) and uracil-N-glycosylase (ung). The dUTPase deficiency prevents the breakdown of dUTP, a nucleotide that normally occurs only in RNA in place of thymine, resulting in an abundance of dUTP; the uracil-N-glycosylase deficiency prevents the removal of uracil from newly synthesised DNA. As the double-mutant E. coli replicates the transformed plasmid, its enzymatic machinery occasionally incorporates dUTP in place of dTTP, resulting in a distinguishable copy. This copy is extracted, and then incubated with the Klenow fragment, dNTPs, DNA ligase, and an oligonucleotide containing the desired mutation, which attaches by base pairing to the complementary wild-type gene sequence. The ensuing reaction replicates the uracil-containing plasmid using the oligonucleotide as primer, thus incorporating the desired mutation. This forms a chimeric plasmid, with one strand unmutated and containing uracil, and the other strand mutated and uracil-free. When this plasmid is transformed into an E. coli strain with normal dUTPase and uracil-N-glycosylase, the uracil-containing strand is broken down, whereas the mutation-containing strand is replicated, forming a plasmid free of uracil and carrying the desired mutation on both strands."

This method is seldom used nowadays because there are easier and better ways out there; nevertheless, the idea of template elimination took hold and forms the basis of all mutagenesis strategies used today.

Probably through pure coincidence, Dam methylation of E. coli was also characterised in the 1970s. This neat system provides a simple way to distinguish daughter strands following replication through the lack of methylation on the adenines of GATC sites; during mismatch repair, the methylation-sensitive endonuclease MutH selectively nicks the unmethylated daughter strand. A restriction enzyme called DpnI acts on the same site but with the opposite specificity: it digests only methylated GATC. It did not take much imagination to realise that DpnI is the perfect tool to selectively remove the Dam-methylated template without affecting the unmethylated DNA generated in vitro.
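To see why DpnI digestion is so effective at removing the template, it helps to remember how often GATC occurs: roughly once every 256 bp in random sequence, so a Dam-methylated plasmid gets chopped into many small pieces. A minimal sketch (my own, with a made-up demo sequence) that locates DpnI sites on a circular sequence:

```python
# Toy illustration: find DpnI (GATC) recognition sites in a plasmid
# sequence. In a Dam+ host every GATC is methylated, so DpnI cuts the
# template at each of these sites while leaving the unmethylated
# in vitro product intact.

def dpnI_sites(seq):
    """Return 0-based positions of GATC sites, treating seq as circular."""
    seq = seq.upper()
    extended = seq + seq[:3]          # wrap around the origin
    return [i for i in range(len(seq)) if extended[i:i + 4] == "GATC"]

# A made-up 40 bp fragment purely for demonstration:
demo = "AAGATCTTGCGGATCCATGATCAAATTTGGGCCCGATCAA"
print(dpnI_sites(demo))
```

Even this short fragment carries four sites; a real multi-kilobase plasmid typically has dozens, which is why a successful digest leaves essentially no intact template behind.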

Yet the easiest way to synthesise DNA, namely PCR, was so riddled with random errors that it was limited to small fragments. Yields with the more commonly used T4 DNA polymerase were poor, and selection was still carried out via phenotype or special vectors for the next decade. Finally the hurdle was removed with the discovery of proof-reading thermostable DNA polymerases, which allowed the direct use of any dsDNA with thermal cycling, greatly reducing the complexity of DNA amplification.

Thanks to this technology, most commercial "fast" mutagenesis kits use thermal cycling followed by DpnI digestion to remove the template before transforming the product into cells on the same day, yet their internal mechanisms may differ more than you think.

The simplest implementation, popularised by Stratagene, is known as Quickchange. Two overlapping primers containing the desired mutation are used with a proof-reading DNA polymerase to generate nicked plasmids; DpnI is added to remove the methylated/hemimethylated source DNA, leaving behind only the mutated strands to transform E. coli, where the nicks are repaired.

Stratagene's own (idealised) illustration

Theoretically the protocol is fast, easy and almost foolproof provided that your enzymes have not expired, yet in practice people (yours truly included) frequently run into all sorts of bizarre problems. The troubleshooting guide supplied with the kit points fingers in every possible direction: too much or too little template, bad plasmids, primer dimers, poorly controlled cycling, not enough respect for their super awesome competent cells (I am quoting verbatim) and so on, without a single mention of the biggest problem: excessive 5'->3' extension.

In a normal PCR, extension requires little attention as long as there is enough time for the enzyme to finish building the new strand; any initial overshoot will be corrected in subsequent cycles by the other primer. However, with a circular template and overlapping primers, the extension conditions must be optimised to prevent the enzyme from displacing its own product and continuing around the circle past its starting point. Such a product does not form a nicked circle and hardly transforms E. coli at all. No polymerase is immune to the problem, and the likelihood increases with each cycle as primers are used up, leaving the template free to anneal with the product. Unlike the nicked circles, over-extended ssDNA can act as a template, further depleting the primers otherwise available for correct binding. (Nicked circles cannot act as templates for synthesis because the strand break is immediately downstream of the primer binding site.)
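The fact that nicked circles cannot re-prime has a practical consequence worth spelling out: correct Quickchange amplification is linear in cycle number, not exponential. A toy back-of-the-envelope model (entirely my own, with made-up quantities, not from any kit manual):

```python
# Toy model of Quickchange yield: each cycle, only the original template
# is copied (products are nicked circles and cannot prime further
# synthesis), and each product consumes one of each primer.

template = 50          # fmol of plasmid template (hypothetical)
primers = 5000         # fmol of each primer (hypothetical)
product = 0

for cycle in range(18):
    if primers <= 0:
        break
    product += template      # at most one copy per template per cycle
    primers -= template      # one of each primer consumed per product

print(product, primers)
```

After 18 cycles this gives only 18 copies per template, which is why the reaction is so sensitive to template quality and why any side reaction that burns primers (such as over-extended ssDNA acting as template) hurts so much.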

Confusingly, many people report better results with longer extension, and some versions of Stratagene's manual recommend 2 min/kb extension instead of 1 min/kb. Attempts to heed these suggestions never worked, but the results did help identify the problem. I can't find the rather dramatic gel photo now, but a 12-hour amplification produced some extremely long ssDNA that would not even migrate in a 0.7% agarose gel; instead it formed a very bright halo around the wells, and it took a good deal of head-banging to figure out what was going on.

In the end we got it working, as I said before. Instead of tinkering with the 9 kb entry vector, the gene was shuttled into a smaller vector (3.5 kb including the insert), which required a much shorter extension time and left less room for error. Fortunately our gene of interest was maintained in a Gateway vector, which made the process much less of an ordeal, but others may not be so lucky.

If your insert is long or subcloning is difficult:
  • Optimise your PCR conditions: reduce the cycle number, decrease extension time and temperature, try a gradient of annealing temperatures, change the template/primer ratio, increase the pH of the reaction buffer, and use additives such as DMSO (I add 4% for every PCR involving GC-rich genes; some difficult templates may require as much as 10%) or betaine.
  • Buy a specialised polymerase (variously referred to as "non-strand-displacing" or "with DNA clamp") that is less likely to have issues with extension. PfuUltra seems to be a popular choice. A new product from Takara known as PrimeStar Max boasts an extension time of 5 s/kb; a mutagenesis protocol tailored for the enzyme uses a very small amount of template and skips the DpnI digest entirely, as the template is easily outnumbered by products after an amazing 30 cycles.
  • Redesign your primers to be partially overlapping. Such primers are less likely to anneal to each other and may allow you to get away with a lower template concentration. I saw a small yet noticeable increase in colony numbers after I redesigned my primers with a short 3' overhang.
  • Buy 5'-phosphorylated primers and do a ligation after the DpnI digestion. This might help stabilise the double strand for some long mutations that do not like to form nicked circles.
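A quick way to check whether a primer pair is fully or only partially overlapping is to compare one primer against the reverse complement of the other. A minimal sketch (my own helper, with made-up sequences for the demo):

```python
# Hypothetical helper for the "partially overlapping primers" tip:
# classic Quickchange primers are exact reverse complements of each
# other; a partially overlapping redesign is not.

COMP = str.maketrans("ACGTacgt", "TGCAtgca")

def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    return seq.translate(COMP)[::-1].upper()

def fully_overlapping(fwd, rev):
    """True if the pair is classic Quickchange style (exact revcomps)."""
    return fwd.upper() == revcomp(rev)

fwd = "GCTAGCCATGGTACCGAGCT"                 # made-up sequence
print(fully_overlapping(fwd, revcomp(fwd)))      # classic design
print(fully_overlapping(fwd, revcomp(fwd)[4:]))  # truncated: 3' overhang design
```

If the check returns True, the pair will tend to anneal to each other rather than to the template, which is exactly what the partially overlapping redesign tries to avoid.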
Invitrogen's GeneArt kit, on the other hand, is designed with the problem in mind: the recommended extension time is halved, and an additional in vitro recombinase reaction is carried out to convert the inevitable linear product into circular DNA before transformation into a special cell type that automatically degrades the template. I have not used this personally, but it looks pretty neat; the price is on par with the Stratagene kit, though again, both are really, really overpriced.

A better system offered by Finnzymes, known as Phusion, uses an alternative strategy. Instead of using a pair of overlapping primers going in circles, it uses two "back to back" primers that amplify away from the mutation site to form a linear product, which is subsequently phosphorylated and ligated before transformation.

Compared to Quickchange, it is much easier to use because
  1. The PCR product is linear and can easily be checked by gel electrophoresis; Quickchange usually ends with a smear, leaving you wondering whether it actually worked.
  2. Ligated DNA has better transformation efficiency
  3. No need for specialised enzymes and protocols, just follow the manual.
except two major drawbacks:
  1. The primers need to be highly purified, e.g. by HPLC or PAGE. The reason is that DNA oligos are synthesised from 3' to 5', so there are always some molecules with one or more bases missing from the business end, which in our case resulted in unwanted deletions. The extra purification can cost a lot if the primers are long, negating the benefits of the method.
  2. Ligation of blunt-end strands usually requires overnight incubation, so one more day is needed before the DNA is ready for sequencing. 
On the other hand, the system is rather fool-proof provided you have good material to work with. For a detailed explanation go here
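The back-to-back layout is simple enough to sketch in code. Below is a rough illustration of the idea (my own interpretation, not Finnzymes' actual design tool, with a made-up template sequence): the forward primer carries the substituted base at its 5' end, and the reverse primer is the reverse complement of the region immediately upstream, so the two sit back to back on the plasmid.

```python
# Sketch of back-to-back primer design for a single-base substitution.
# Hypothetical helper; real designs should also check Tm and 3' ends.

COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    return seq.upper().translate(COMP)[::-1]

def back_to_back_primers(template, pos, new_base, length=25):
    """template: plasmid sequence; pos: 0-based index of the base to change."""
    fwd = new_base + template[pos + 1 : pos + length]   # mutation at 5' end
    rev = revcomp(template[pos - length : pos])         # upstream region, flipped
    return fwd.upper(), rev

# Made-up 60 bp "plasmid" for demonstration only:
tpl = "ATGGCTAGCAAGGAGAAGCTTACCGGTACGTAGCTAGCATCGATCGTACGATCGATTGCA"
fwd, rev = back_to_back_primers(tpl, 30, "C", length=12)
print(fwd, rev)
```

Because the whole plasmid is amplified between the two primers, the product is a linear double strand whose ends meet exactly at the mutation site, ready for phosphorylation and blunt-end ligation.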

Dill's Refined Quickchange Mutagenesis Protocol

Before you start:
  1. Check your template by digesting it with DpnI and running a gel alongside the undigested plasmid. This step is not mentioned anywhere else but is highly recommended, because it establishes several things that often go wrong: that your DpnI is active and that your template is sufficiently pure, supercoiled and methylated.
  2. If your plasmid is stored in TE buffer, consider ethanol precipitation and re-dissolving it in water, as EDTA can affect polymerase activity, unless your template is very concentrated and only added in small quantities.
  3. Consult the documentation of your polymerase to work out the best reaction conditions. You often need more polymerase than normal to get the best yield. 100 ng of template and 150 ng of each primer appears to be a good starting point for point mutations on plasmids from 3 kb to 6 kb; outside this range you might need to experiment to find the best molar ratio.
  4. Design your primers. There are many ways to do this; the best strategy is to use Takara's protocol as a starting point and apply common sense: add a few more bases when in doubt, especially on the 5' end; try to end your primers on a C or G; avoid repetitive sequences that self-anneal. Stratagene's recommendations and their web-based tool should be taken with a grain of salt. Melting temperature is not critical unless you are using some picky polymerase that you should not have bought in the first place; if you absolutely have to, use the nearest-neighbour method on the non-mutagenic portions as a guide. Desalted or cartridge-purified primers are fine in most cases, and the money is always better spent on a few extra bases on the 5' end than on more expensive purification steps.
  5. Get some competent cells with a competency of 10^8 cfu/µg or higher. An hsdR genotype is preferred since your DNA will be entirely unmethylated.
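The 100 ng / 150 ng starting point above is easier to reason about as a molar ratio. Here is the arithmetic as a short sketch (hypothetical numbers: a 4 kb plasmid and a 33-mer primer; the average masses of ~650 g/mol per bp for dsDNA and ~330 g/mol per base for a ssDNA oligo are standard approximations), plus the crude GC-content Tm formula for a sanity check:

```python
# Convert ng to pmol for template and primers, and estimate primer Tm.
# Assumed averages: 650 g/mol per bp (dsDNA), 330 g/mol per base (oligo).

def pmol_dsdna(ng, bp):
    return ng * 1000 / (650 * bp)

def pmol_oligo(ng, bases):
    return ng * 1000 / (330 * bases)

def tm_gc(seq):
    """Rough Tm estimate: 64.9 + 41*(GC - 16.4)/N. Sanity check only;
    nearest-neighbour methods are more accurate."""
    seq = seq.upper()
    gc = seq.count("G") + seq.count("C")
    return 64.9 + 41 * (gc - 16.4) / len(seq)

template = pmol_dsdna(100, 4000)   # 100 ng of a 4 kb plasmid
primer = pmol_oligo(150, 33)       # 150 ng of a 33-mer
print(f"{template:.3f} pmol template, {primer:.1f} pmol primer, "
      f"ratio ~{primer / template:.0f}:1")
```

With these numbers each primer outnumbers the template by a few hundred to one, which is the kind of excess you want so that primers, not template strands, win the annealing competition.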
Thermal cycling:
  1. Set up the protocol according to the polymerase used.
  2. If there is a final extension step, don't include it.
  3. Do not exceed the extension time specified for your enzyme.
  4. 15-18 cycles should be enough, do not exceed 20 cycles.
Following that:
  1. Add an excess of DpnI, usually 10 U per 50 uL reaction, but feel free to add more if you have a lot of template; the PCR buffer is usually not optimal for DpnI activity anyway.
  2. Vortex and centrifuge the tube to ensure good mixing. Incubate at 37C for 1-2 hours in a PCR machine with heated lid or cover with mineral oil to prevent evaporation. Vortex/spin at least once during incubation. 
  3. (Optional) Add Proteinase K and incubate to stop the reaction. Do not heat inactivate since this might interfere with strand pairing.
  4. (Highly optional) Phosphorylate purified product and ligate with T4 DNA ligase. CSL recommends this step for all applications but I feel this is only for long mutations that you should not have used a Quickchange protocol to begin with. 
  5. (Electroporation only) Desalt the product if your apparatus is prone to arcing; chemically competent cells are better for this purpose.
  6. Add the product to competent cells, heat shock/electroporate, add SOC and shake at 37C for 2 hours instead of the usual 1 hour to allow cells to repair the plasmid. This is more important if your selection antibiotic is bactericidal, such as kanamycin or streptomycin.
  7. Plate cells on appropriate selective plates. A good reaction should result in tens to hundreds of colonies. 
  8. Pick 3-4 well-spaced colonies, grow them up, miniprep plasmids and sequence.
Common Issues:
  1. For unknown reasons Quickchange sometimes results in random insertions. Nevertheless there should be at least one colony with the desired mutation and no other errors.
  2. If there are no colonies at all, consider transforming with more product and plating a higher volume. I make my agar plates in advance and store them in the fridge without a bag; after one week they will have lost some of their water content, and as a result up to 400 uL can be spread easily.
  3. If no clones can be found after you screen 5-8 colonies, it is very likely that you have had contamination or template carry-over and you should start over and be more careful.
  4. The efficiency of the reaction can be assessed with blue-white-screening-compatible cell strains and plasmids. Normally more than 90% of colonies should be mutants; if not, there is likely an issue with amplification or digestion.
Good luck, and let me know your successes/questions on twitter @DillADH

New Kindle for a New World

Yup, the new Kindles are here, and the prices are better than I ever expected. For those in the US, you can get a subsidised e-reader for as little as $79. For the same price you get your choice of the new namesake Kindle Touch or the same old Kindle 3. A steep discount of up to $40 (already applied in the image above) is available for those who sold their souls, er, agreed to receive and view paid advertising while they are not reading.

The new Kindles remind me strongly of their Sony counterparts, with silver-ish covers and minimalistic design. Heck, the touch version does not even come with physical page-turning buttons.

While the display quality of the first Kindle left plenty to be desired, it is possibly the most ergonomic Kindle to this day. The subsequent Kindles saw gradual improvement to everything except the buttons (not counting the K3's five-way controller, which I am fond of), with the keyboard on the two Kindle DX models bordering on the realm of uselessness. Now, with the touch version, they have done away with the physical keyboard completely.

More virtual QWERTY keyboard, you must be kidding

If you have ever tried to type anything longer than a short email on a modern tablet you will understand my frustration: they are simply painful to type on. The QWERTY keyboard was designed for physical keys, not glass surfaces with no tactile feedback.

I am already sick of touchscreens, which seem to find their way into everything from the space shuttle to the common refrigerator. Before 2007 it would have been outrageous to sell something without physical keys; now the reverse seems to be true.

Rant over; let's get back to the topic.

Preliminary teardowns suggest that the lowest-priced Kindle comes with a Cortex-A8-based SoC; beat that, Nokia. However, the RAM and battery capacity have both been halved to reduce the overall cost.

Do I have any desire to upgrade my current complement of reading devices (consisting of a Kindle DX International and a Kindle, er, Keyboard 3G, not to mention smartphones and computers with Kindle clients)? Well, not really. The hardware on the entry-level model is too limited to count as an upgrade; the CPU bump is a nice touch, but the halved RAM killed it for me. The smaller battery, while drawing much criticism, should not be a serious issue in this day and age, when we have already grown accustomed to charging our gadgets once a day. I don't see much point in a touch-operated e-reader, let alone a multimedia tablet. Engadget summed up my opinions rather eloquently in an earlier post: in essence, nobody apart from the tetraplegic really needs a tablet. The current craze for bigger screens and touch gestures is nothing more than an invention, like the myth of Highlander culture invented by a bunch of Edinburgh merchants in the 18th century.


Its saving grace is that Amazon has a sensible grasp on the best use for tablets: an entertainment slate, stripped of all the purported productivity features. In any case, everybody agrees that the Kindle Fire is the Android twin of RIM's soon-to-be-forgotten PlayBook.

Like it or not, Amazon is the internet's upcoming Wal-Mart and the entity most likely to change, once again, the way we read. I can already see Kindles being given away for free in exchange for a certain number of book purchases or Amazon Prime subscriptions... wait, I have seen something like this already.

Enter awkward opera: my favourite rendition of Der Hölle Rache by the venerable Diana Damrau:

P.S. If you own a Kindle 2/DX like I do, it is highly recommended that you flash Yifan Lu's hacked 3.1 firmware. You lose a few unimportant functions such as active content and TTS, but you get a greatly improved reading experience, so check it out. If you don't have access to a K3 to extract the files, tweet me @DillADH and I will help you.