[To contribute to this conference, send your message to [email protected]
For further information on the Electronic Forum on Biotechnology in Food and
Agriculture see Forum website.
Note, participants are
assumed to be speaking on their own behalf, unless they state otherwise.]
Sent: 13 May 2003 11:27
To: '[email protected]'
Subject: 50: Detecting GE crop products - Labelling
This is Budi Prakoso. I am a lecturer in the Faculty of Agriculture of
Sudirman University, Indonesia. I am currently at the Asian Institute of
Technology, Bangkok, Thailand, evaluating some detection methods for GMOs.
I agree that regulation should be practical, low-cost and implementable. In
terms of genetic engineering (GE), especially for developing countries,
politicians can make strict regulations on GE. But how is the law to be
enforced, when:
1) Real-Time PCR machine, or even conventional PCR machine, is not available
in every provincial laboratory.
2) The cost of reagents and international reference standard materials is
high.
3) There is no international standard material for every food product.
4) Detection of GE crop products in food is not as easy as in raw material.
5) Extraction of DNA from certain processed products is difficult or even
impossible.
6) Food labelling policy differs in each country.
I have evaluated Lateral Flow Strip (LFS), ELISA plate kit and PCR for
detecting Roundup Ready soybean in fermented food. The LFS failed to detect
it. The ELISA plate gave a false positive result. The conventional PCR method
can detect the GE material in fermented food but cannot be used to quantify
the percentage of GE material in the food. I have not tried Real-Time PCR
yet (I do not have the machine), but I do not think it will give accurate
and precise results for quantification of the GE percentage in any processed
food.
Based on these facts, it is reasonable to require labelling of raw materials
but not of processed food products.
Asian Institute of Technology
e-mail: bpd999870 (at) ait.ac.th
[Simply put, testing procedures for the presence of GM material may be
DNA-based or protein-based. DNA-based tests, generally using the Polymerase
Chain Reaction (PCR), look for specific genes or DNA sequences genetically
engineered into the organism; conventional PCR indicates the presence of GM
DNA in a product, while Real-Time PCR aims to quantify the amount.
Protein-based tests look for the proteins produced by the introduced DNA,
using e.g. Enzyme Linked Immunosorbent Assay (ELISA) technology or Lateral
Flow Strip technology, a variation on the ELISA tests that uses strips
instead of wells to detect the presence of the protein...Moderator].
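The moderator's distinction suggests why conventional PCR is qualitative while Real-Time PCR can quantify: quantification rests on comparing amplification of the transgene against a taxon-specific reference gene. The following is a minimal sketch of one such calculation, the comparative delta-Ct approach, with hypothetical cycle-threshold values chosen purely for illustration (none of these numbers come from the messages):

```python
# Sketch: relative quantification of GM content from real-time PCR
# cycle-threshold (Ct) values, comparative delta-Ct approach.
# All numbers below are hypothetical, for illustration only.

def pct_gm(ct_transgene, ct_reference, efficiency=2.0):
    """Estimate GM percentage as the ratio of transgene copies to
    reference-gene copies, assuming both assays share one ideal
    amplification efficiency (a deliberate simplification)."""
    delta_ct = ct_transgene - ct_reference
    return 100.0 * efficiency ** (-delta_ct)

# If the reference (e.g. a lectin) gene crosses threshold 5 cycles
# before the transgene, the transgene is ~2^-5 of the reference copies.
print(round(pct_gm(ct_transgene=30.0, ct_reference=25.0), 2))  # 3.12
```

In practice, laboratories calibrate against certified reference materials of known GM percentage rather than assuming ideal efficiency, which is one reason the cost of reagents and international reference standards matters so much.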
Sent: 13 May 2003 14:48
To: '[email protected]'
Subject: 51: Oversimplification - traditional breeding and GE
[Reminder: Messages should not exceed 600 words...Moderator].
This is from Dick Richardson, United States.
I've watched with interest the comments so far. I will focus on related
technical points of agricultural crop breeding and the science of genetics.
I will relate these points to regulatory perspectives. This is intended as a
rational contribution to the discussion, so I disclose my active teaching
and research context and how it skews my perspective. I apologize for any
inaccuracies.
There is a tendency to consider the cellular and organismal consequences of
genes moved around by hybridization equivalent to those of genes introduced
by molecular techniques. They are neither functionally nor procedurally
equivalent. The mechanisms of how and why are actively being revealed.
Neither are the regulatory implications and consequences expected to be
equivalent as a result of the genomic organization in time and space.
The new understandings from molecular technological advances are clearly
reaffirming and extending the differences between these two modes of
changing genomes, and "simplifying assumptions" of equivalence cannot be
made in the face of current and increasing direct evidence to the contrary.
Two articles recently published in the journal Science are relevant here: an
historical review of DNA since Watson and Crick's work ("DNA's Cast of
Thousands", Science, April 11, 2003, pages 282-285) and a related discussion
on "counting" genes ("Defining Genes in the Genomics Era", pages 258-260 of
the same issue), both describing the complexity of defining a functional
"gene" in a sequence of DNA; a third piece is a book review in the same
issue (reference given below). All of them illustrate the large historical bodies of work
that tend to be disregarded until they are again examined with modern tools.
Regulators MUST take note of these well established bodies of genetic
knowledge and the profound way they affect our basic assumptions of risk and
remediation. The regulatory needs for effective protection of humans and
other species are remarkably different for genetically constructed novelty
using new technology. The traditional approach applies to chemical toxins
that cannot replicate themselves, but is grossly inappropriate for genes and
species that replicate themselves. Furthermore, the assumption of
"equivalence" between past genetic changes from breeding or natural
selection and those made by genetic engineering is tenuous in special cases
and, in this larger context, becomes largely allegorical. As a basic
scientist, I find this
oversimplified perspective shocking and uninformed. I'll give only examples
that I might use in a basic undergraduate genetics class for biology majors.
One example involves the gene regulatory role of chromatin (histone proteins
around which DNA is wound in the eukaryotic chromosome of higher plants and
animals). This role is discussed in a recent review of "Chromatin and Gene
Regulation" by Bryan M. Turner in Science (April 11, 2003, pages 252-254),
and summarizes how such simplistic assumptions fall short. It illustrates
how "old ideas" are revitalized with new data, after being "outdated" for
lack of observable details that now begin to suggest mechanisms for the
original model. This is a problem when young geneticists or "crossover"
scientists unwittingly oversimplify limited observations made with new
tools, being unfamiliar with the history of research published a decade or
two earlier. It seems that some of the most promising research questions are
re-examinations of those from earlier "generations" of researchers.
Referring to the historical reviews: we have come from DNA being highly
correlated with a heritable bacterial trait (1944), to DNA being a
self-replicating polymer, thereby sharing a key property of genes (1953), to
triplet codes for amino acids that match the key polymeric structures
(1961), and to mRNA and tRNA being the intermediary molecules for functional
linkages into protein synthesis (early 1960s, "one gene, one protein"). New
data on the complexity of regulation then enlarged the concepts to "one
gene, several proteins", and revealed a class of genes for regulation of RNA
transcription by DNA-binding proteins (1967) that are coded and produced
from other genes or modifications thereof. Now, regulatory RNAs coded into
DNA may lie within a gene coding a protein. A more comprehensive definition
of a gene is a sequence of DNA coding a functional product (see the
"Defining Genes.." article). Genes no longer are perceived as beads on a
string each coding a different protein. One gene may be overlapping other
genes or included within other genes, and a region of DNA may code multiple
RNAs and proteins. Many genes are essential for the protein synthesis
function of any particular gene. Many more genes and conditions in the cell
are responsible for the regulation of a gene's activity. To characterize "a
gene" requires these interactions to be characterized. The base sequence of
a gene is a beginning, but no more than the prologue of a book.
Much of the last 40 years has revealed this complexity: interactions among
many genes, extracellular "imports" across the membranes, and metabolites
all affect the activities of a single gene. Now, with better
technology, we are seeing that there are secondary and tertiary phases,
notably those reactions that modify histones (scaffolding around which the
DNA is wound in a chromosome of nuclear DNA) which, in turn, controls DNA
activities. A "histone code" that fine-tunes DNA activity has been proposed,
adding even greater depth to genetic coding beyond the triplet code for
amino acids. (There is a family of genes that code the various histones.) We
have oversimplified the role of histones for decades, considering them
uninvolved in regulation of other genes, which now is known to be untrue.
Physical location and neighboring genes make a difference (observation), but
we don't understand much of the "how" (unknown processes that now are being
studied).
Next, we can consider the multiple proteins that can be coded into one gene.
The different immunoglobulins are examples, and they relate to our immune
functions, ranging from protection, to allergies, to autoimmune
self-destruction. Each
possibility is "selected" by regulatory processes that control where DNA is
modified, where RNA begins and ends synthesis on the DNA strand, overlapping
genes that share parts of the DNA, pieces of RNA extracted that can even
turn different genes "on" or "off" and other "transcription factors"
influencing sets of genes to form a veritable "symphony" of organized and
harmonized activities (which, if disrupted, have unpredictable effects, but
observable in a variety of manifestations).
If a student were so brash as to claim that we know much about DNA and its
functions, and thereby claim that we can be certain that a new gene cassette
will have no unexpected effects, then I would suggest that they change major
fields and pursue a career other than biology. These profound
areas of ignorance are what make biology, in general, and genetics, in
particular, such an exciting field of research. Particularly, it is exciting
because we now have new tools to probe the dynamics of the system(s). The
questions also suggest that we know very little about the way genes from
another distantly-related species will work, when created in combinations
with new regulatory formats, in a new genome, and in multiple possible
locations within the host's genome. The instability and uncontrolled
insertion sites of the gene cassettes used today are, indeed, well known.
The systemic effects are unknown, mostly untested, and thereby they all are
very interesting scientific questions. From current regulatory perspectives,
the questions are assumed to be nonexistent, or to have minimal practical
relevance, and so they are unworthy of examination. This is utter nonsense
if we read the flow of grant proposals up for review and possible funding
(presented in positive wording of discovery and significance, not the
negatives of "anti" GM).
Many of the discoveries come at the interface of different disciplines and
subdisciplines, among which are now nanotechnology and molecular biology.
Bryan Turner's "Chromatin and Gene Regulation" is a good place to see what
this means. We can rest assured that there are vast areas of ignorance to be
exposed just a year or a decade away, many of which will underscore relevant
dynamics about genetic engineering (not the "benefits" of the technology,
which are extrapolations of science fiction, which also can be exciting).
Specifically, we clearly are dealing with complex functions and effects of
genes operating in specific immediate environments in a cell. When genetic
organizational changes occur in hybridization, a complex system is involved
in moving linked blocks of genes on a chromosome to new but similar places
(crossing over) or a set of chromosomes into new genomic sets
(polyploidization). The organization and integrity of the chromosome has
been recognized from the first half of the 20th century (e.g. Richard
Goldschmidt's work on sex determination, discussed in his book, Theoretical
Genetics, 1955) where he considered an entire chromosome to be an integrated
organelle. Today's molecular research confirms and extends many of
Goldschmidt's basic ideas. In contrast to hybridization, a "shotgun"
insertion or viral (infectious) insertion of a DNA segment cannot be assumed
to be neutral or equivalent to any normal cellular process until
demonstrated to be so.
If we follow the scientific principle of using simple explanations over more
complex ones until the observations demonstrate the need for complexity
(Principle of Parsimony), we should assume that new and unpredictable
effects will occur until convincing evidence is available supporting a model
of insignificant disruptive effects of one gene on other genes' functions.
"Beneficial" or "detrimental" are terms that cannot be equated with "pest
resistance" or "drought resistance" or "salt tolerance" until we know the
rest of the genetic interactions. No gene adequately studied is known (to my
knowledge) that has only one such phenotypic manifestation. From basic
enzymology we know that any gene for an enzyme in a metabolic pathway
affects many other steps in the pathway when it is differentially regulated.
Furthermore, some of these genes are known to also function in developmental
pathways as well. In addition, the new synthetic cassette insertions
frequently are unstable, presenting new challenges of characterizing a newly
formed cassette in multiple genomic "habitats." Fortunately, such evidence
of many changes can be obtained using existing molecular technology
(proteomics, in particular). In time, I suspect we'll see patterns that
help find reliable shortcuts, both in construction of novel cassettes and
integration into the genome and chromosome. However, until the complexity is
taken as a "given" we are operating with the equivalent of the "flat earth"
model of navigation.
From a regulatory perspective, risk assessment is a balance between profits
and potential costs of damages borne/internalized by the perpetrator
(manufacturer and marketers) or society or individuals (externalized). It
has taken a long time for US courts to begin financial corrections to
internalize a part of the costs of smoking tobacco to the tobacco industry
for health hazards that previously were resisted by regulatory "favors"
undemocratically requested by the industry. The difference between GE and
health effects of smoking cigarettes is profound - the self replication of
genetic elements continues any flow of costs and may be unstoppable,
resulting in vastly increased costs. In this context, we have the ecological
disasters of "invasive species" as a guide for risks of this type. There is
no projected end of the damages being caused, and much greater care is
necessary for regulatory protection than has characterized past issues of
health and environmental damages by chemical toxins that may decompose or
become diluted in time. The ecosystem extends from the genome to the
community level, and into the future with no known limit. There is no "super
fund cleanup" option to mitigate the damage, once initiated. Asymptotically,
the entire cost is borne by individuals and societies far into the future,
possibly at increasing intensities and varieties of effects. Externalized
costs become absolute.
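The asymmetry described here, between damage that decays and damage that self-replicates, can be made concrete with a toy discounted-cost calculation. All numbers below are hypothetical, chosen only to illustrate the shape of the argument, not taken from any actual assessment:

```python
# Toy illustration: present value of a damage stream that decays
# (a decomposing chemical toxin) vs. one that keeps growing
# (a self-replicating genetic element). Hypothetical numbers only.

def present_cost(initial, growth, discount, years):
    """Sum of yearly damages that grow (or decay) at rate `growth`,
    discounted back to the present at rate `discount`."""
    return sum(initial * growth ** t / (1 + discount) ** t
               for t in range(years))

# Damage that halves each year converges to a finite total...
toxin = present_cost(100.0, 0.5, 0.03, 200)
# ...but damage that merely keeps pace with the discount rate adds the
# same present-value cost every year, so the total grows without bound.
replicator = present_cost(100.0, 1.03, 0.03, 200)
print(round(toxin, 1), round(replicator, 1))
```

The point of the sketch is only that no finite horizon caps the second stream: extend `years` and the replicating case keeps climbing, which is the sense in which externalized costs "become absolute".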
What is the regulatory framework appropriate for such non-reversible
potential damages that have incalculable economic costs? The more complex
social injustices and environmental damages are also incalculable, although
even more important in the long run than any economic evaluation can
capture. Past
experience with introduction of new species into new habitats suggests a
need for new models of input from informed publics to guide regulatory
agencies' approach to such issues, and to include effective insulation of
the agency functions from influence by strong special interests. For me, a
regulatory framework needs to include at least the following aspects for GE
products, in contrast to those potential damages that diminish in time and
space:
- What justification can be made for haste and superficial evaluation of
risks?
- Have alternative solutions that cost little, with less (and different)
risk, been examined for the perceived benefits?
- Who benefits, and who takes the risks and pays whatever costs there may
be?
The University of Texas at Austin
1 University Station C0930
Austin, Texas 78712-0253
d.richardson (at) mail.utexas.edu