The ’80s and ’90s: facing frictional forces


Frictional force refers to the force generated when two surfaces in contact slide against each other; the intensity of this force is affected by surface texture, angle and position, as well as the amount of force pressing them together.
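As a rough textbook illustration (a simplification, not drawn from the AUC literature), the magnitude of sliding friction is often modeled as proportional to the normal force pressing the two surfaces together:

  F_friction = μ F_normal

where μ is a coefficient of friction determined by the properties of the surfaces.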

Despite its proven value to the research community, including ongoing support from highly respected scientists such as Howard Schachman at the University of California at Berkeley, AUC began to lose much of its luster by the early 1980s.

Exacerbating this problem were three “frictional forces.”

First was a growing belief that AUC was too difficult to perform for many routine experiments, and instead was valuable only for highly purified systems, and only to researchers seeking a deep chemical understanding of a solution.16

Consequently, many biochemists and molecular biologists started determining molecular weights using simpler, less costly techniques such as gel permeation chromatography and gel electrophoresis. What’s more, crystallography and nuclear magnetic resonance (NMR) were beginning to gain more attention—and, as a consequence, less attention was given to AUC.9 In fact, it wasn’t unusual for textbooks published in the 1980s to completely drop sections on the topic of analytical ultracentrifugation.

Proponents countered that AUC was actually easier to perform than other routine methods of molecular biology—and provided thorough, easy-to-understand information about molecules in solution, even without detailed quantitative analysis. An unexpectedly fast or slow boundary, for instance, could provide the insight necessary to interpret confusing results obtained from other experimental methods.16

The second frictional force was closely related to the first.

Notwithstanding the groundbreaking work that had been done by Fujita in the previous two decades, analysis of the massive amounts of data generated by AUC was still a complex and time-consuming undertaking. Extracting sedimentation coefficients, translational diffusion coefficients, frictional coefficients and buoyant mass from sedimentation velocity data was difficult—particularly from complicated mixtures—because simple methods for solving the Lamm equation did not yet exist.16
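For context, the Lamm equation itself is standard AUC theory rather than something introduced in this article. It describes how the radial concentration distribution c(r, t) of a solute evolves in the spinning cell:

  ∂c/∂t = (1/r) ∂/∂r [ r D (∂c/∂r) − s ω² r² c ]

where s is the sedimentation coefficient, D is the translational diffusion coefficient, ω is the rotor's angular velocity and r is the distance from the axis of rotation. The equation has no simple closed-form solution under realistic experimental conditions, which is why fitting it to raw boundary data remained so laborious.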

Third, overall interest in the biophysical chemistry of biopolymers had begun to wane in favor of emerging developments in molecular biology and recombinant technology.11

If the 1950s through the ’70s was a “boom time” for AUC, it seemed the boom was about to end.

What Goes Around Comes Around—At 60,000 RPM

The Beckman XL-A analytical ultracentrifuge

Actually, it didn’t take long for it to become clear that the analytical ultracentrifuge was not destined to suffer the same fate as the rotary phone and the eight-track tape.

By the mid-1980s, it was apparent that NMR (in high-resolution mode) and crystallography could be applied to only a limited number of biological macromolecules. Furthermore, gel filtration and gel electrophoresis techniques had been shown to be much less reliable than AUC for obtaining precise molecular weights—and precise molecular weight data were crucial for evaluating the subunit composition of macromolecular assemblies.9
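The precision advantage is captured by the Svedberg equation (standard sedimentation theory, summarized here for context), which ties molecular weight directly to quantities AUC measures in solution:

  M = s R T / [ D ( 1 − v̄ ρ ) ]

where M is the molar mass, s the sedimentation coefficient, D the diffusion coefficient, R the gas constant, T the absolute temperature, v̄ the partial specific volume of the solute and ρ the density of the solvent. Because both s and D are measured directly under native solution conditions, no calibration against standards of assumed shape is required, unlike in gel-based methods.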

And ironically, by the 1990s, some of the same advances in molecular biology and biotechnology that had seemed on the verge of relegating AUC to the dustbin of scientific history—such as research on biotherapeutic proteins—required an even more rigorous understanding of the physical interactions of biopolymers and nanoparticles.

As a result, interest in AUC began to rebound.

This “AUC renaissance” was greatly accelerated in 1992 with the introduction of a game-changing new analytical ultracentrifuge from Beckman Instruments—the XL-A (now called the ProteomeLab XL-A and manufactured by Beckman Coulter Life Sciences).

With all the analytical power of a Spinco Model E but a design that resembled the Model L, the XL-A was compact and easy to operate. And unlike in earlier models, rotor speed, temperature and data acquisition were all computer controlled.

Now experiments lasting from several hours to many days could be performed with minimal operator intervention—and data could be viewed and analyzed in real time as the experiment progressed.12

Hearkening back to the days when Howard Schachman would call Ed Pickels at Spinco to suggest refinements, the new XL-A featured an absorbance optical system and a scanning monochromator that enabled measurement of sample concentration at wavelengths from 190 to 800 nm.
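The physics linking absorbance to concentration is the Beer–Lambert law (standard optics, noted here for context):

  A = ε l c

where A is the measured absorbance, ε is the molar extinction coefficient of the solute at the chosen wavelength, l is the optical path length through the cell and c is the concentration. Scanning the monochromator across the 190–800 nm range lets the operator select a wavelength at which the molecule of interest absorbs strongly.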

Rayleigh interference optics were later added to the ProteomeLab XL-A, creating an instrument (the ProteomeLab XL-I) that could simultaneously record data with both types of optical systems.

As Schachman and Pickels had learned decades earlier, each optical system has certain advantages and disadvantages. Absorption optics are particularly useful for experiments involving nucleic acids, and for detection of any macromolecules containing strong chromophores.

In contrast, the interference optical system is superior for analyzing macromolecules lacking intense chromophores (e.g., polysaccharides), as well as samples that contain strongly absorbing buffer components. It’s also the optical system of choice for characterizing highly concentrated samples.

Regardless of which optical system was used, the XL-A and XL-I ultracentrifuges represented an extraordinary advance in the history of AUC.

In the 1960s, measuring molecular weights and characterizing association patterns for a simple protein that self-associates could take several weeks, even in the most well-equipped laboratory. Now such measurements could be done routinely in a few days.

So instead of being considered an out-of-date method, AUC in the 1990s was becoming the method of choice for elucidating a growing array of properties of macromolecules in solution, including:

  • Size
  • Size distribution
  • Purity
  • Gross conformation
  • Thermodynamic nonideality (including virial and activity coefficients)
  • Equilibrium constants for self-association
  • Ligand binding and binding to other macromolecules
  • Stability of macromolecular complexes
  • Mechanisms of self-assembly
  • Structure of gels and macromolecular networks

The technology conceived by Theodor Svedberg in the ’20s, refined by Beams and Pickels in the ’40s and invested in by visionary Arnold Beckman in the ’50s was once again making a serious contribution to life sciences research worldwide.

Well, mostly serious.

It was also about this time that long-time AUC paramour Howard Schachman, by then a respected research biochemist and professor of the graduate school at the University of California at Berkeley, came up with an interesting way to demonstrate what he was learning from AUC about the stability of dimers and trimers.

According to Schachman, who was known for his impish sense of humor, his “monkey models” became a popular part of the literature at the time, and were prominently featured in his lecture for the Harvey Society at Rockefeller University.14

It wasn’t one of Schachman’s monkeys, however, that would drive the next substantial movement forward for AUC.
It was a Lamm.

9 Harding SE. Analytical ultracentrifugation and the genetic engineering of macromolecules. Biotechnol Genet Eng Rev 1993;11:317–356.
12 Cole JL, Hansen JC. Analytical ultracentrifugation as a contemporary biomolecular research tool. J Biomol Tech 1999;10:163–176.
14 Howard Schachman, “University of California Professor of Molecular Biology: Discussions of His Research Over His Scientific Career From the 1940s Until 2010,” conducted by Sondra Schlesinger between 2007 and 2010, Regional Oral History Office, The Bancroft Library, University of California, Berkeley, 2010.
16 Laue T. Analytical ultracentrifugation: a powerful ‘new’ technology in drug discovery. Drug Discovery Today: Technologies 2004;1(3):309–315.
