Evolution is a unique property of living organisms; no living organism can escape it. The essence of evolution is the constant production of variation. Drift and selection within a population then control the extent of that variation. Like flowing water in a river, nothing is static, even if it appears stationary.
Not only living organisms but also some inanimate systems are recognized as evolvable, such as neural networks, logistics and the internet. What property makes a system evolvable? Evolvable systems can generate ‘variation.’ Because of that variation, they are robust and adaptable in the face of environmental change. How do they make the variation? What is required for a system to become evolvable? I identify two fundamental mechanisms of evolvable systems: 1. replication with errors (the replication system) and 2. a lot of detours within a network (the detour network system). The fundamental difference between the two is that the replication system creates variation in a population, while the detour network system creates it within the network. Either system alone appears to be sufficient for evolvability. Interestingly, living organisms, including cancer cells, use both; to my knowledge, they are the only systems that do. As far as I know, these are the only two mechanisms that confer evolvability.
If something is replicated (or duplicated) with 100% fidelity, no variation is produced, and thus no evolution takes place. Imagine an article written in an unknown language with unknown letters. You try to replicate it with your handwriting. Maybe you recognize lines and patterns, but nothing makes sense to you. You intend to make no errors, but errors are probably unavoidable. The first trial, the second, the third and so on. The errors you make likely differ in each trial. There might be sections where you always make errors because the letters are highly complicated or repetitive, but you have no way to validate your copy by reading it. All you can do is make careful visual comparisons. These become the first-generation copies; all would differ from each other and from the original. Then second-generation copies are created based on the first-generation ones. A unique set of errors is introduced again, on top of those already present in the first generation. After several generations, the variation among the copies will be huge. I cannot predict what portion of the articles would still be legible in the original language, or whether the message of the original article would be retained. Printing and photocopying overcame these issues, but printed copies degrade with age. Current digital articles do not degrade. Therefore, there is no variation, and thus no evolution.
We see the same thing in the DNA sequences of living organisms. Each round of DNA replication introduces errors, so the variation of DNA sequences increases each time. Some sequence patterns, like repetitive sequences, cause more frequent errors than others. The conflict between DNA replication and transcription is also a big topic these days as a cause of frequent errors. In each round of replication, a cell gains a new set of errors, although in most cases they are unnoticeable in cellular morphology and physiology. Indistinguishable in phenotype, but constantly changing. A replication system with errors always and inevitably produces variation in the progeny.
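To make this concrete, here is a toy simulation, a sketch rather than a model of real DNA replication; the alphabet, per-base error rate and doubling scheme are arbitrary choices of mine. Copy errors alone make the number of distinct sequences grow generation after generation.

```python
import random

random.seed(0)
ALPHABET = "ACGT"
ERROR_RATE = 0.01  # arbitrary per-base error probability

def replicate(seq: str) -> str:
    """Copy a sequence; with probability ERROR_RATE a base is redrawn at random
    (so a few 'errors' silently restore the original base)."""
    return "".join(
        random.choice(ALPHABET) if random.random() < ERROR_RATE else base
        for base in seq
    )

original = "".join(random.choice(ALPHABET) for _ in range(1000))
population = [original]
for generation in range(1, 6):
    # every sequence is copied twice; each copy gains its own unique set of errors
    population = [replicate(seq) for seq in population for _ in range(2)]
    print(f"generation {generation}: {len(population)} copies, "
          f"{len(set(population))} distinct sequences")
```

With these toy numbers, essentially every copy is unique after a single generation; fidelity controls how fast the variation accumulates, not whether it accumulates.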
The second evolvable system is the complex network system. Imagine a big city like New York. There are a lot of routes to take you from place A to place B: a lot of detours. There is a shortest route, which you use most frequently. But even if the shortest route is blocked by an accident, you can still reach place B. Your options can be practically infinite if you are okay with a longer route. No single street or corner is essential. By adding new streets and corners, the shortest route may not become any shorter, but the variation of detours always increases. Therefore, the robustness of travel from place A to place B increases. You could also say that the route choice is plastic and adaptable to unpredictable accidents.
On the other hand, if a selective pressure like travel time or travel cost is applied, the shortest or cheapest route is chosen more frequently. Not all routes are equally chosen, but almost all routes remain available. Interestingly, in real life ‘shortest’ is highly context-dependent and not only a matter of physical distance. Some routes may have frequent traffic jams, construction or accidents. Depending on the day, time, weather or season, physically ‘shorter’ does not mean ‘faster,’ ‘easier’ or ‘more convenient.’
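A small weighted-graph sketch of the two points above; the street map and weights below are invented for illustration. Many detours make the trip from A to B robust to a blocked street, and changing the weights, say from distance to travel time under traffic, changes which route counts as ‘shortest.’

```python
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 10),                 # the direct street
    ("A", "C", 2), ("C", "B", 3),   # the shortest route, via C (cost 5)
    ("A", "D", 4), ("D", "B", 4),   # a detour (cost 8)
    ("A", "E", 6), ("E", "B", 6),   # another detour (cost 12)
])

print(nx.dijkstra_path(G, "A", "B"))  # ['A', 'C', 'B']: the shortest route

# Block the shortest route: street C-B is closed. A detour still gets you there.
G.remove_edge("C", "B")
print(nx.dijkstra_path(G, "A", "B"))  # ['A', 'D', 'B']

# Context-dependent 'shortest': a traffic jam triples the cost of A-D, so the
# chosen route changes even though no street became physically shorter.
G["A"]["D"]["weight"] = 12
print(nx.dijkstra_path(G, "A", "B"))  # ['A', 'B']: the direct street wins now
```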
Living organisms are a replication system embedded within a detour network system consisting of nested networks of biochemical reactions with many detours. A cell is the minimum unit of life. It is encapsulated by a sac of plasma membrane. Within the sac, various biochemical reactions take place and create many cascades and cycles. Each network has a major path with many detours. This is similar to the traffic and logistics systems within a city. The major players controlling biochemical reactions are proteins. Self-assembly of protein-protein interactions and protein-substrate catalytic interactions orchestrates various cellular activities and functions. With no intention or purpose, the simple thermodynamic rules of self-assembly and energy equilibrium drive the reactions.
How did these cascades of organized biochemical reactions emerge in a cell? In other words, how did life start? Initially, nothing is organized, and there must be many spontaneous random reactions: a chaos of random, unconnected reactions. Everything required to support the first cell must exist, but nothing is connected into cascades. There are tons of random ‘dots and connections.’
Constraining diffusion is critical. Since all biochemical reactions are governed by affinity, kinetics and thermodynamic equilibration, the concentration of the molecules involved in a reaction is essential. Keeping everything in a small compartment, preventing diffusion away, makes even low-affinity, slow-kinetics reactions possible.
Imagine a remote island like the one in “Cast Away.” Everything necessary to sustain your life is available on the island, such as food, water, clothes and tools, but you have no idea where to find it. All you can do is wander around to find things one by one. The route you find may not be the shortest, but it does not matter. Even if your way is highly inefficient, the situation is much better than not finding it at all. This allows you to survive. At the beginning, each trip has just one destination. But you may take a wrong turn off the correct route, or, simply out of curiosity, explore beyond it. By chance, you may find shortcuts, other routes or the locations of other essentials. More wandering around brings more opportunities to find other unknowns. You start building a network of routes and locations. Eventually, a single trip may cover all destinations. Even if a landslide blocks one path, you can find detours if your network is complex enough.
Interestingly, however, too many detour options compromise the efficiency of the network. You risk making wrong turns and may get lost. Although a random complex network is necessary to find the shortest route in the first place, once the shortest (most efficient) route is identified, the randomness hurts efficiency and is better suppressed (trimming). On the other hand, focusing on efficiency often compromises robustness, even though robustness is much more critical for survival.
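Here is a rough way to quantify that trade-off. This is a sketch with parameters I chose arbitrarily, measuring ‘robustness’ as the fraction of single-edge failures that still leave the start and destination connected: trimming everything except the shortest route maximizes efficiency and destroys robustness.

```python
import networkx as nx

def robustness(G, s, t):
    """Fraction of single-edge removals after which s and t stay connected."""
    ok = 0
    for u, v in list(G.edges):
        G.remove_edge(u, v)
        ok += nx.has_path(G, s, t)
        G.add_edge(u, v)
    return ok / G.number_of_edges()

# A wandering-built network: guaranteed connected, with plenty of detours.
G = nx.connected_watts_strogatz_graph(10, 4, 0.3, seed=1)
short = nx.shortest_path(G, 0, 9)
print("full network:", G.number_of_edges(), "edges,",
      "robustness", round(robustness(G, 0, 9), 2))

# 'Trimming': keep only the edges along the shortest route. Maximally
# efficient, but any single blocked edge now disconnects 0 from 9.
T = nx.Graph()
T.add_edges_from(zip(short, short[1:]))
print("trimmed route:", T.number_of_edges(), "edges,",
      "robustness", round(robustness(T, 0, 9), 2))
```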
Imagine a smaller or larger island, or an island missing water or food. Without wandering around, you never know what is available. You hope everything is available and can be discovered, but you only learn this as a retrospective consequence, through ‘dead or alive.’ What I want to emphasize with this analogy is that the island’s composition is luck, outside of your control. Survival is not the consequence of a logically planned bottom-up strategy. Rather, survival depends on everything on the island (although you have no idea prospectively what will be useful) and on random bumping around within a limited space, followed by trimming. The network created on the island is the retrospective consequence of survival.
In human societies, we build something new bottom-up using logic (i.e., engineering). All pieces within the new creation should “make sense” in their roles. In contrast, a new thing in nature emerges from chaos into order. We then try to make sense of all the pieces within it, explaining their presence by their roles. The fundamental difference between the two is that the latter is always retrospective. Only humans can set prospective goals and purposes. Nature creates neither goals nor purposes, just continuation. The teleological view in biology is incorrect. Anything that looks like a goal or purpose is a retrospective construction, made for our satisfaction so that it feels as if it “makes sense.”
The dots of ‘dots and connections’ in a cell are metabolites, nucleic acids and proteins. The connections are physical interactions between two random molecules. DNA and RNA are polymers of nucleotides, and proteins are polymers of amino acids. Their polymerizing reactions are self-assembly reactions that do not need an additional energy source such as ATP; they can self-polymerize. The order of nucleotides or amino acids within each polymer is random. A protein consists of multiple tandemly connected peptides, and each peptide creates multiple unique 3D interfaces based on its amino-acid sequence. For biochemical reactions, it is the 3D interface of a peptide that catalyzes a reaction, not the amino-acid sequence information itself. Simply by changing their sequences, polymers of amino acids can make a huge variety of 3D interfaces. For example, something abundant in the environment (molecule A) is catalyzed into something else (molecule B) by the activity of the 3D interface of a randomly created peptide. This first product (molecule B) becomes a second product (molecule C) through another peptide, and so on. A cascade of chemical reactions could happen, particularly when diffusion is constrained. A cascade or two may emerge accidentally from many random peptides within a sac. Nothing needs to be high-affinity or efficient. Crowdedness within the sac and variation among 3D interfaces would be important for the chance of two random molecules bumping into each other to test whether any reaction happens.
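A toy illustration of how cascades can emerge by chance alone; every number here is invented, and each random ‘peptide’ is reduced to a single conversion between two molecule types. As more random catalysts accumulate in a sac, molecule A becomes connected, step by step, to more and more products.

```python
import random

random.seed(2)
N_MOLECULES = 50  # toy universe of molecule types, numbered 0..49
START = 0         # molecule A, assumed abundant in the environment

def reachable(peptides):
    """Molecule types reachable from START via the random one-step conversions."""
    seen, frontier = {START}, [START]
    while frontier:
        m = frontier.pop()
        for a, b in peptides:
            if a == m and b not in seen:
                seen.add(b)
                frontier.append(b)
    return seen

for n_peptides in (10, 50, 200, 1000):
    # each random peptide's 3D interface happens to convert one molecule type
    # into another; neither affinity nor efficiency is modelled at all
    peptides = [(random.randrange(N_MOLECULES), random.randrange(N_MOLECULES))
                for _ in range(n_peptides)]
    print(f"{n_peptides:5d} random peptides -> "
          f"{len(reachable(peptides))} molecule types in the cascade from A")
```

With only a handful of random peptides, A leads almost nowhere; with enough of them, nearly every molecule type becomes reachable through some roundabout cascade.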
The biological system has never been built with bottom-up logic. Instead, it was constructed accidentally by ‘mix and match’ and ‘pick and choose’ from whatever was available to keep replicating a minimal biological unit, a cell. Primitive pre-life must have been a sac holding many cascades of chemical reactions. Such sacs are not alive because they cannot replicate themselves, but each sac can be a complex biochemical network with many detours, working as a whole. All of its chemical reactions are based on the 3D interfaces of peptides created by random amino-acid sequences.
On the other hand, DNA is an amazingly convenient polymer, forming a double-stranded helix out of just two pairs of complementary nucleotides. Heating disentangles the double helix into single strands; cooling lets single strands re-form double strands. Or, if more nucleotides and a catalytic enzyme are available, a second strand can be synthesized using the original single strand as a template. Interestingly, the DNA polymerizing process itself does not require extra energy from ATP, whereas DNA helicase, which unwinds the double strand, does. This is one reason it has been assumed that early life might have emerged around heat convection, as in ocean hot springs, where heat could take care of the unwinding.
DNA polymers are a replicable system. Peptides, polymers of amino acids, are catalytic molecules that can build a network of biochemical reactions. RNA polymers connect the two through a code, the triplet codons. Through this code and RNA polymers, the sequence of amino acids in a peptide is tightly linked to the sequence of a DNA polymer that can replicate. Now it is possible to make the same peptide again and again.
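The logic of that linkage can be sketched in a few lines of code. The codon table below is a real but tiny subset of the universal genetic code, and the sequence is an arbitrary example: as long as the DNA replicates faithfully, the same peptide comes out every time.

```python
# Triplet codons map a replicable DNA sequence to an amino-acid sequence.
# Only a handful of the 64 codons are included here for illustration.
CODON_TABLE = {
    "ATG": "M",  # methionine, the canonical start codon
    "TTT": "F", "GCT": "A", "GGT": "G", "TGT": "C",
    "AAA": "K", "GAA": "E",
    "TAA": "*", "TAG": "*", "TGA": "*",  # stop codons
}

def translate(dna: str) -> str:
    """Read the coding strand in-frame, three bases at a time."""
    peptide = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE.get(dna[i:i + 3], "?")  # '?' = codon not in this toy table
        if aa == "*":  # a stop codon terminates the peptide
            break
        peptide.append(aa)
    return "".join(peptide)

print(translate("ATGTTTGCTAAATAA"))  # -> "MFAK": same DNA, same peptide, every time
```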
Various combinations of everything available were brought together in sacs of lipid bilayers, creating chemical chaos in each sac. Perhaps there were tons of sacs differing in size and composition. Some sacs would sometimes fuse, and some would split, repeatedly mixing various molecules. Remember the first appearance of the iPad? All the electronic components making up the iPad were already available; Steve Jobs’s vision of a portable tablet was the critical driving force, and probably many prototypes were created and abandoned. At the beginning of life, it was mix & match, pick & choose from an ancient, random, chaotic organic chemical soup. Lipid bilayers form sacs that work as barriers to diffusion. Cycles of drying and wetting may cause condensation of components and fusion/fission of sacs. Many different patterns of combinations must have been tried. Most did not work, just as many prototypes never reach the market.
I have no idea how many times successful life-like creatures were made, or whether all of them used the same DNA code. All we can say is that all living organisms today use the same DNA code. If a random mix & match works for replication, it continues. Nothing needs to be efficient; the only requirement is that nothing disturbs the ability to replicate. Like crystallization, it can keep replicating. Now the unit of replication emerges: a network of biochemical reactions with DNA replication inside a sac, called a cell. Simultaneously, the cell is the unit of selection. Only the cell, as a whole, is the selectable, replicable unit. Conflicts between local and global optimization arise: local optimizations that destroy the whole cannot be incorporated.
One DNA polymer with a random sequence can carry a lot of peptide sequence information. In terms of the amount of information, longer is better for creating a more complex chaos, and such a DNA polymer allows replicable chaos in each sac. Interestingly, chaos is needed for a network to form, but it affects the network negatively and destructively once the network has formed. The formation of a network splits the original information in two: useful or useless for the network. Useless information should be eliminated to suppress the potentially destructive chaos. Now a longer DNA is not better at all; just keep the information valuable for replication of the whole. Trimming of valueless information occurs. I think this likely happened in prokaryotes. Therefore, their genome has no extra information; it is all useful.
This is efficient and stable. But chaos is necessary for anything innovatively new to emerge, and because everything is already useful, any further change is destructive, compromising the original stable property. Then how did eukaryotes, cells with a nucleus and other organelles, emerge? The current best theory is the fusion of multiple prokaryotic organisms; chloroplasts and mitochondria are examples of symbiosis, maintaining their own compartments. I speculate that many more prokaryotic organisms fused and shared their DNA polymers and biochemical networks. A new chaos. Some organisms stayed unicellular; there the unit of selection remains the individual cell, and unnecessary information was trimmed again. On the other hand, multicellularity also emerged, most likely because some peptides in the chaotic pool of peptides can stick out of the plasma membrane and interact with the environment and with other cells. These then build an upper nested network: the multicellular organism.
The most significant impact on evolution during the establishment of multicellularity is the shift of ‘the unit of selection’ from an individual cell to an organism. If every cell in an organism were equal, they would all need the same information, and the chaos could be trimmed in all cells in the same manner. But if each cell type uses different parts of the information, what is useless in one cell type could be useful in another. The DNA polymer carrying the pool of peptide information cannot be easily trimmed. Instead, unique suppression mechanisms and rules emerged to control useless information and chaos, like DNA and histone methylation and transcriptional/translational regulation. The suppressed DNA regions are often called non-genic and considered not to encode anything useful. However, non-genic regions carry hidden peptide information for ancestral chaotic biochemical reactions. In normal healthy conditions, this chaos is actively suppressed in multicellular organisms. In other words, the maximization of local optimization (trimming DNA) is prevented for the sake of the whole, because the whole, an organism, is the unit of selection, not the cell. If an organism’s upper network is destroyed, that is ‘death’ for the organism, even if its lower networks, the individual cells, are still alive and functional.
Evolvable systems are not easy to perturb and eliminate. Because they are robust, they can find a way to overcome the perturbation. The replication system deals with perturbation mainly through clonal selection; the detour network deals with it through plastic route selection. As I discussed above, each cell of a unicellular organism, and each organism of a multicellular species, is the unit of selection. Each cell in a multicellular organism is constrained within the whole, and its evolvability is actively suppressed.
Cancer is a disease in which the active suppression of the inner chaos in a cell is compromised, permitting cells to regain their evolvability as individuals. Because cancer cells originate from our own cells, they can communicate with the networks that constitute the whole organism. The best treatment is physically taking them out, disengaging them from the network of the whole, because ‘death’ for an organism is the destruction of its highest network. Therefore, surgery is still the top standard cancer treatment option.
The second treatment option is chemotherapy, the chemical elimination of cancer cells. By targeting the unique biological properties of cancer cells, it places them under specifically high stress so that they kill themselves using their own biological mechanisms. The third treatment option is radiotherapy, which uses the same concept of applying high stress. Radiation is toxic to cells, particularly proliferating cells. Local exposure to high-dose radiation puts the cells in the treated area under high stress, and they kill themselves. These two approaches work well to reduce the tumour mass. However, resistance often develops before all cancer cells are eliminated. My fundamental question is whether applying high stress to cancer cells to activate their own killing systems can ever remove all of them. Is this fundamentally a good strategy? In my view, it is not. Applying high stress compromises the active suppression of the chaotic peptide pool hidden as non-genic information in the genome. In both chemo- and radiotherapy, many cancer cells die from the high stress, but because of that same stress, the surviving cells gain access to the chaotic peptide pool, which provides detours bypassing the stress and death pathways. The specificity and efficiency of the detours are not an issue; non-specific or low-efficiency detours are enough to escape death. Many detour choices lie hidden as available 3D interfaces of peptides in non-coding sequences for transcription and out-of-frame coding sequences for translation.
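The population dynamics implied here can be sketched with a toy simulation; all numbers are invented, including the key assumption that stress itself switches a small fraction of surviving cells into a resistant, detour-using state. Each treatment cycle shrinks the population dramatically, yet the resistant fraction grows until regrowth takes over.

```python
import random

random.seed(3)
cells = [{"resistant": False} for _ in range(100_000)]

for cycle in range(1, 7):
    survivors = []
    for cell in cells:
        if cell["resistant"]:
            survivors.append(cell)        # detours bypass the death pathways
        elif random.random() < 0.01:      # only 1% of sensitive cells survive stress
            # assumed: stress de-represses the hidden chaotic peptide pool in a
            # fraction of the survivors, providing detours from then on
            cell["resistant"] = random.random() < 0.10
            survivors.append(cell)
    # survivors regrow between cycles (simple doubling; no accelerated growth)
    cells = survivors + [dict(c) for c in survivors]
    n_res = sum(c["resistant"] for c in cells)
    print(f"cycle {cycle}: {len(cells):6d} cells, {n_res} resistant")
```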
The complete elimination of a biologically evolvable system by biochemical strategies is, empirically, impossible, because the evolvable system is robust, plastic and selectable. In addition, high stress further enhances robustness and plasticity by releasing the inner chaos of biochemical reactions. Cancer is challenging to treat because cancer cells fully exploit the advantages of both evolvable systems: clonal selection and network plasticity. Replication errors create variation among cells, and the robust plastic networks within a cell and in its surrounding environment always accommodate that variation through compensation, without impacting the whole. Clonal growth is inevitable once cells gain the no-death property; even without accelerated proliferation, the homeostatic basal cell-division ability is sufficient.
As long as these cells play only within their inner networks of biochemical reactions, this is not yet cancer. The patient’s death is the consequence of the corruption of the highest nested level of the body’s biochemical networks. The lower inner networks progressively modulate the nested upper networks of the local environment through extracellular biochemical networks, eventually reaching the highest ones responsible for systemic homeostasis. This is cancer, and this is cancer death.
As I keep discussing here, robust networks can compensate for (in more neutral terms, buffer) various changes and insults without impacting the integrity of the whole. This is what happens during carcinogenesis. Irreversible genetic or genomic changes are introduced within a cell by replication errors. The robust biochemical networks at each nested level, within the cell, the tissue environment or the patient, minimize their impacts. The changes caused by these intrinsic, irreversible errors are not recognized as an abnormality the way a pathogen infection is. Therefore, cancer patients usually do not notice their cancer until the late stages, when systemic symptoms are sensed. Because our homeostasis system consists of nested layers of robust biochemical reaction systems, it compensates for various inner changes to protect the integrity of the nested upper layers. Unless the changes are recognized as external insults, the immune system cannot eliminate them.
Aiming for 100% elimination with biochemical strategies is impossible, and applying high biochemical stress in the expectation of activating self-killing activities may not be a good approach after all. Instead, we could consider co-living with cancer cells, like accepting life with a parasite. Induction of dormancy would be one unexplored direction; I interpret the success of Gleevec as an unintentional induction of a dormant condition. Biochemically disengaging cancer cells from the networks of their surrounding environment should be considered as an alternative strategy. Applying stress to cancer cells should be avoided. Balanced co-living, like an ecological system in nature, should be the goal.