NOTE - this has been updated slightly.
At the ICC, the GENE team introduced a new genetics simulation tool with an astonishing number of input variables - it seems to be really well thought out. The results of the simulations performed with it support Sanford's "genetic entropy" thesis - that the genetic load that comes with mutation far outweighs any beneficial mutations that may occur.
So the question is, why does evolution go downward? Sanford's conclusions are:
Sanford said that selection breaks down at about the 0.001 fitness-reduction level. At that point, selection is simply unable to remove the trait from the population. Recessive genes are also harder to select away.
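Sanford's breakdown threshold can be illustrated with a standard population-genetics result (this sketch is mine, not from the talk): Kimura's diffusion approximation for the probability that a single new mutant eventually fixes. For a slightly deleterious mutation, the fixation probability approaches the neutral value 1/(2N) as the population shrinks, which is the sense in which selection "cannot see" very small fitness effects.

```python
import math

# Kimura's diffusion approximation for the fixation probability of a
# single new mutant with selection coefficient s in a population of
# size N. Standard theory, used here only to illustrate the threshold.
def p_fix(s, N):
    p0 = 1.0 / (2 * N)  # initial frequency of one new mutant copy
    if s == 0:
        return p0       # neutral case: fixation probability is 1/(2N)
    return (1 - math.exp(-4 * N * s * p0)) / (1 - math.exp(-4 * N * s))

s = -0.001  # a 0.1% fitness reduction
for N in (100, 1000, 10000):
    neutral = 1.0 / (2 * N)
    # A ratio near 1 means the mutation fixes about as often as a
    # neutral one, i.e. selection is effectively blind to it.
    print(N, p_fix(s, N) / neutral)
```

In this approximation the cutoff scales with population size (roughly |s| < 1/(2N)), so a fixed 0.001 threshold corresponds to a particular effective population size; it also hints at why bottlenecks are so damaging - shrinking N makes even larger fitness costs invisible to selection.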
Their software can also simulate a population bottleneck. It showed that this actually leads to a dramatic loss of fitness because of the rapid fixation of genetic damage. So, in the fitness graphs, a genetic bottleneck causes a temporary transition from a downhill slope to a downhill cliff. When the population recovers, it is back on the downhill slope.
Sanford said that this data should cause the following shifts in evolutionary thinking:
Now, personally, I wonder if some of this relies too much on Darwinian assumptions. Here are some basic issues:
Okay, so I probably need to explain directed mutagenesis a little more and why it impacts their model.
Historically, evolutionary theory has treated mutations as happening essentially haphazardly - that is, without any particular constraint (except perhaps incidental ones) on which DNA bases get modified. However, what if that assumption is wrong? There are several possibilities:
Anyway, it would really be interesting to see the application reworked with some of these concepts in mind.
In addition to the ICC conference, the BSG conference was held this week, so I'll be covering some of the talks given there throughout the week.
Kurt Wise gave an excellent presentation on one possible criterion for determining the extent to which mammalian (especially ark-based) baramins have diversified (a "baramin" is a Genesis created kind - NOT equivalent to species): the Post-Flood Continuity Criterion (PFCC). It is based primarily on data from the monograph Classification of Mammals, which has abundant data on the geological layers in which different mammalian organisms were found.
Kurt argued that the fossil record at the genus level for mammals is essentially complete, with some specific exceptions. Therefore, a given baramin should have a fossil record that goes all the way back to the flood; for this study, Kurt used the K/T boundary as the flood/post-flood boundary. Kurt argued from the data of the fossil record that, although we normally equate the baramin with the family as a first-pass approximation, many of these families do not go all the way back to the flood. However, if we extend this to the superfamily level, we often find extinct families which do go back to the flood. Kurt argued that these extinct organisms were the ancestors of the modern families of organisms.
Another interesting thing Kurt noted was that in order to go all the way back to the flood, you had to essentially connect all of Ruminantia to be part of the same baramin. What's even more interesting is that a friend had previously speculated just this very same thing to me on biological grounds (specifically, the uniqueness of the Ruminant stomach, and the fact that most of the other traits can be had by stretching/deforming basic morphologies). This would mean that cattle, deer, sheep, goats, and giraffes are all in the same baramin.
In any case, it is important (as Kurt emphasized) to keep in mind that baraminology is holistic, not reductionistic, and therefore no one criterion should be adopted for establishing baraminic continuity and discontinuity. Some of the issues with the Post-Flood Continuity Criterion are:
Instead of giving a step-by-step overview of Steve Austin's presentation, I'll just give the highlights. My fingers couldn't keep up with the typing last night, and I didn't follow all of the geology concepts.
He also referenced the "Bedform Stability Diagram" which shows how different-sized particles behave underwater in different currents. A form of the diagram is viewable here (on page 8 & 9), though it is much more complicated than the one he showed on his slide.
So, his points were:
He also told an amusing story: as a graduate student, in order to get the laminae concept to work in the lab, they had to take mud, clean it, bleach it, and treat it with special chemicals before they could get it to form laminae by the traditionally conceived method :)
He also made several points about Kelvin–Helmholtz instability which went by too fast for me to understand.
He also suggested that Creationists should set up a racetrack flume for experimentations on this model.
[Again, my own comments are in brackets. The first talk I went to was John Hartnett's "Starlight, Time, and the New Physics." I didn't get to use my computer during it because it was standing room only. It doesn't matter too much, because most of it was beyond my current understanding. Apparently he is coming out with a new book with Carmeli, but he didn't say the title. Basically, what he was saying was that Newtonian physics occurred in 3 dimensions, Einstein added a 4th (time), and Carmeli added a 5th (velocity). His theory can account for the motion of galaxies and the expansion of the universe without dark matter or dark energy. However, apparently it has a little more trouble accounting for our own galaxy - or perhaps he simply had more trouble explaining how it accounts for our own. I'll try to read something about it at a later date.]
Russ started by congratulating those of us who went to Hartnett's theory and made it through his equations :)
Ridiculously simple idea: God used water to make magnetic fields in the cosmos
Explains magnetic fields of: stars, galaxies, and planets
Hydrogen nuclei have magnetic fields. A proton spins more slowly than an electron and makes a field about 1/1000 as strong. But in water, the hydrogen nuclei point every which way, so normally water is not magnetic. It can be magnetic, though, if the nuclei line up.
God formed the earth from created water. 2 Peter 3:5 - "The earth was formed out of water and by water". So what would happen if God, when he made the earth, used water and lined up the proton spins? (Obviously this would be followed by binding the water together into other elements.)
If all H-nuclei are aligned it will have a large magnetic field.
The field would be 7.9 Gauss at the poles (an MRI is about 10,000 Gauss; we are currently in a 0.5 Gauss field).
Created magnetism depends on mass
Original magnetic moment = planet mass × 0.94 A·m²/kg
Approximation: the original magnetic moment (in A·m²) is roughly equal to the planet's mass in kilograms
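As a quick check on the slide's arithmetic (a sketch of mine - the 0.94 constant is from the slide, while the planetary masses are standard values I have supplied):

```python
# Plug the slide's formula in for a few planets:
# original magnetic moment ~= planet mass x 0.94 A*m^2/kg,
# which is why the moment is "approximately the mass in kilograms".
K = 0.94  # A*m^2 per kilogram, from the slide

masses_kg = {
    "Earth":   5.97e24,
    "Mars":    6.42e23,
    "Jupiter": 1.90e27,
}

for planet, mass in masses_kg.items():
    moment = K * mass  # predicted original magnetic moment, A*m^2
    print(f"{planet}: ~{moment:.2e} A*m^2")
```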
Lines of force = magnetic flux (Faraday)
In ordinary water, molecules collide, disorient spins, and start electric current in the water within seconds. As each spin got out of alignment, it would induce current.
Current at creation = 130 Billion amperes. Flux would be conserved (he said the reason, I didn't understand it).
Transforming to solid earth conserves magnetic flux and keeps the same mass (so the current mass would be the same as the mass of the original water).
The current runs down - starting at 130 billion amperes and decaying to 6 billion amperes, with a 2,000-year half-life. Perhaps other things happened during the flood as well.
Half Life =~ (conductivity) * (radius^2)
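Taking the talk's figures at face value (the starting current and half-life below are the talk's numbers, not mine), the decay of the core current is plain exponential:

```python
# I(t) = I0 * (1/2)**(t / T_half): exponential decay of the core
# current using the figures quoted in the talk.
I0 = 130e9       # amperes at creation (talk's figure)
T_HALF = 2000.0  # years (talk's figure)

def current(t_years):
    return I0 * 0.5 ** (t_years / T_HALF)

print(current(0))     # 130 billion amperes
print(current(2000))  # 65 billion amperes after one half-life
print(current(6000))  # 16.25 billion amperes after three half-lives
```

Note that a constant 2,000-year half-life over roughly 6,000 years only gets from 130 down to about 16 billion amperes, not 6 billion, so the extra decay presumably comes from the "other things" during the flood that he mentioned.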
Created flux decayed fast in smaller planets. Also depends on material.
We can deduce from decay rates what the conductivities of the cores are.
Two main groups - gas giants (gas) and terrestrial planets (rocky and iron).
Gas giants have low conductivity and terrestrial planets have high conductivity. Matches what we know from material science.
Humphreys made several predictions in 1984 in the Creation Research Society Quarterly, "The Creation of Planetary Magnetic Fields." All of his 1984 predictions are NOW FULFILLED.
Other things that confirm theory but were not explicitly predicted in 1984:
Solar system data fits Humphreys's theory, but only with Biblical conditions (6,000 years and original water)!
Stars and Magnetic Fields
Galaxies and Magnetic Fields
How did God create galactic magnetic fields?
One scenario -
Universe may be God's biggest magnet
[This is an interesting, if EXTREMELY SPECULATIVE area, which I think was pioneered by the dude who did Creation's Tiny Mysteries (forgot the name). Anyway, I thought it was interesting, but it should be considered several orders of magnitude more speculative than the rest of the presentation.]
Magnetic fields show God's handiwork in the heavens.
Why can't we measure magnetic fields of planets but we can measure magnetic fields of stars? Answer - the magnetic fields of stars are large enough that they produce spectral effects.
Early equation - is there a theoretical basis for the number? Answer - it's in the book. Based on lining up protons.
On waters above - would it be directional or detectable? What would you look to find? Answer - I would expect some direction, and there's some intriguing astrophysical data that might point to that field, but we can't be sure. [mentioned some things about favored axes of radio wave spins and other things, but I wasn't paying enough attention - but ultimately the evidence is small]
Didn't hear question - magnetization of meteorites seems to imply that they were part of a larger body which was about earth-sized.
Plasma cosmology question - he didn't know much about plasma cosmology
Hartnett - Whole class of stars called "strange stars" - range from stars made of diamond (pure carbon) right up to quark stars, could probably be added to the graph of dots and it would probably line up.
Hartnett - something about rapidly spinning objects and event horizons
Hartnett - universe-sized magnetic moment - what about just treating galaxies as single-spin systems, and then add up total amount of galaxies - Audience comment - Harold Aston has done just this thing, but audience member did not know what the conclusion was.
We are not seeing galaxies at creation - we are seeing them after about 300 million years of winding (using their local clocks).
[Since the conference room does not have WiFi, I'll have to just "pseudo-liveblog" this thing, and then post it when I get back to my room :) ]
[The first session I'm going to is Kevin Anderson's "A Creationist Perspective of Beneficial Mutations in Bacteria". All my own comments are in brackets]
Advantages of studying bacteria:
Significant features of bacterial genome:
Many of these features were thought to exist only in higher organisms, but have been found in at least E. coli.
Mutations maintain diversity. Bacteria have the ability to intentionally mutate their genome.
Wild Type + mutant => (a) more fit, (b) less fit, (c) neutral - can be any one of these
Study of E. coli after 20,000 generations (Lenski)
Lenski 1999 - mutant strains possessed 50% greater "relative" fitness compared to the parent (for the given environment).
Schneider et al 2000 and Cooper et al 2001 - the beneficial mutants were the result of genetic disruptions (knockouts) - i.e., they were all degenerative
IS Element activates promoter to provide expression of a gene.
IS Element might also have an active repressor which disrupts it.
spoT mutants -> decreased ppGpp -> increased tRNA and rRNA -> increases protein synthesis - starts with a disruption or reduction of the cell's control mechanism
Mutants were less fit in other environments, such as different temperatures.
Conclusions of Lenski's long-term adaptation study:
Increase the temperature of E. coli and you get lots of mutants with gene duplications and deletions - and the genes involved in coping with higher temperature are the ones duplicated!
(Riehle et al 2001, PNAS; Riehle et al 2003, Physiol. Genom.; Riehle et al 2005, Physiol. Biochem. Zool.)
Other studies show that when you return the organism to normal temperatures, the duplications are removed [other studies not specified]
Antibiotic/antimicrobial resistance Mechanisms:
MarR - represses promoter so that marA and marB are not expressed
In a marR mutant the repression is lost; MarA then activates the promoter region, increasing expression of the system and producing both MarA and MarB (the marAB operon).
Metronidazole activation - [could not follow this one quickly]
Erythromycin resistance in E. coli - loss of an 11 bp segment of 23S RNA
Kanamycin resistance in E. coli [slide was up too briefly to complete]
Anderson 2005 CRSQ has a list of resistance phenotypes and genotypes, and shows the degenerative nature of the genotypic changes.
Bacterial Response to Starvation
Glucose-limited adaptation - two mutant organisms that work together: [these are two different mutants in different cells I think]
Hypermutations - impaired repair mechanisms - increases chance of "beneficial" mutations under stress conditions.
[PROBLEM - keeps on banging the "beneficial but degenerative" drum]
Directed or random? - talked more by Georgia on Wednesday
Cultivation of Lac- in a medium with lactose as the sole catabolite. Frameshift reversal occurs at a much higher rate than random mutation would predict.
Nylon Degradation - Nylon-degrading bacteria were identified in the 1980s. This was assumed to be the evolution of a new metabolic pathway. Most commonly studied: Arthrobacter sp. K172. EI (NylA), EII (NylB), EIII (NylC) - on a plasmid
Carboxylesterase - the original version will not metabolize nylon. EII has an active site with broadened specificity that lets it process nylon. Broadening the specificity of an enzyme is a degenerative process. Prediction - EI and EIII will be found to be degenerative (specificity-broadening) mutations. Same prediction for the opp protein in his next example on transport proteins.
Several mutations at once, but all degenerative.
Citrate evolution after 30,000 generations. E. coli in aerobic conditions cannot process citrate. Lenski has found E. coli that can process citrate. NOTE - the genetics of this has not been studied - only the phenotype.
Perhaps all Enterobacteria are the same created kind. [Interesting!]
citT (citrate transporter) is expressed anaerobically. If citT is cloned into a shuttle vector (Martinus et al. 1998 - J. Bacteriol.), E. coli can utilize citrate aerobically. The only thing that needs to be done is activating or derepressing the gene. The only mutation may be the loss of citT regulation! [Super interesting!]
Antagonistic Pleiotropy - Analogy of "beneficial" mutations to constructing a house - removing non-supporting walls to create a larger dining room. Lose a room, but gain a function. Doesn't explain how the house is constructed.
Creation Model - Rigid Flexibility - flexibility that goes only so far.
Bacteria can often get back to wild type by recovering systems.
Penicillinase - the only possible example of antibiotic resistance that has a chance of not being degenerative
Reversion - either genotypic (the specific mutation reverts back) or phenotypic (a suppressor and then a repressor mutant)
Why is losing specificity a loss of ability, especially if the enzyme's V(max) is not affected? Metabolism is managed by having very specific, narrow metabolic pathways; a decrease in specificity causes long-term problems as it compounds, because metabolism requires tight specificity.
In debate, we need to force evolutionists to show why the mechanisms they have examples of could contribute to large-scale evolution. If the mechanism is deregulation, then it can't be the source of novelty.
There is no current research on the limits of baramins, but there probably needs to be.
Why is it called "antagonistic pleiotropy"? It seems not to use "pleiotropy" in the strictest sense, but that's what the evolutionists have called it.
Isn't the reversion an increase in specificity? Couldn't other mutations increase specificity? There's no example of it occurring. [Isn't somatic hypermutation (SHM) in immunoglobulins an example of increasing specificity?]
[I think it was a good presentation, but I think he beat the degenerative drum WAY too much. My BSG presentation should indicate some ways in which organisms could theoretically produce increased specificity non-degeneratively (though there is always a tradeoff somewhere).]
[The second session I'm going to is Bob Hill's "The Tectonics of Venus and Creation"]
Venus as a prototype of Plate Tectonics - this is a review of secular literature in order to get it into the creationary literature. There are arguments both for and against catastrophic plate tectonics on Venus; this presentation covered just the "pro" side.
From radar, Venus looks a little like the earth, but drained of its oceans
Magellan - radar-mapped 98% of surface
Geologic Structures on Venus
Venusian interior seems to have core, mantle, and crust.
What are coronae? They are not found elsewhere. A possible explanation (based only on photographs): two layers of rock, with magma injected between them, lifting the upper layer up. The magma later drains and the layer drops back down, leaving circular faulting.
Are there any structures on Venus which are analogous to terrestrial structures like subduction zones? A few have been proposed, all hotly debated. The ones found don't look that much like subduction zones. We need more space probes!
Why would Venus have lid tectonics and the earth plate tectonics? Really unknown.
Did Earth have impacts at the same time as Venus? Probably, but we have weather and Venus doesn't, so Earth's craters have eroded away.
Is there evidence on a young-earth timescale of a resurfacing event? Not much, except that it matches Catastrophic Plate Tectonics.
Faulkner has suggested two cratering events.
The surface temperature of Venus is fairly uniform throughout. The atmosphere keeps it that way.
Uniformly random cratering is unique to Venus. What about Mercury? It currently looks nonrandom, but not enough mapping has been done. If Mercury is nonrandom, it indicates that there may in fact have been a recent (during/post-flood) resurfacing event on Venus but not elsewhere. [Very interesting!]
[Very interesting stuff!!! Certainly at the beginning of understanding, but it looks promising]
The schedule for the BSG conference has been posted. As you can see, I'll be talking about what Wolfram's complexity classes can tell us about Irreducible Complexity. I'm sure that Stephen Wolfram would cringe if he knew that's what his research was being used for. That makes me both happy and sad - happy because it was somewhat providential that I came to know about A New Kind of Science, and sad because I wish that I could repay Stephen for the good he and his company have done my family, and I don't think that he would take too kindly to the direction that I'm taking his work.
Two years ago I gave a talk to the BSG on how the genome acts as a metaprogramming system during VDJ recombination (see R14). I wanted to get someone to do additional testing on this, or at least write it up as a proper paper, before presenting it on the blog, but since I haven't had the time or resources for either, I figure I'll just go ahead and post it.
All computer programmers are inherently lazy - that's why the only thing we are good at is getting computers to do things for us. In fact, we're so lazy that if we can write a program for the computer to generate code for us, we will. Such programs - programs to generate programs - are called metaprograms. If you're interested in the computer-programming aspect of metaprograms, I wrote a three-part tutorial series on it (Part 1 | Part 2 | Part 3).
But the key parts of metaprogramming are these:
Metaprogramming systems are used to abstract away two things that make programming difficult:
Now, because metaprogramming systems are doing all of this, it necessarily means that metaprogramming systems are narrow in scope.
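To make the idea concrete, here is a toy metaprogram in Python (an invented example of mine, not from any real system): it takes a tiny specification, generates the source code for accessor functions, and then runs the generated code.

```python
# A minimal metaprogram: generate Python source from a spec, then
# execute it. The spec format and function names are invented.
FIELDS = ["name", "age", "email"]

def generate_accessors(fields):
    """Return Python source defining one getter per field."""
    lines = []
    for field in fields:
        lines.append(f"def get_{field}(record):")
        lines.append(f"    return record['{field}']")
    return "\n".join(lines)

source = generate_accessors(FIELDS)  # the "generated program"
namespace = {}
exec(source, namespace)              # compile and load it

record = {"name": "Ada", "age": 36, "email": "ada@example.com"}
print(namespace["get_name"](record))  # -> Ada
```

Note the narrowness: this metaprogram can only ever emit dictionary getters. In exchange, adding one entry to the spec consistently updates every generated function - exactly the tradeoff described above.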
The cell can generate millions or billions of antibodies out of a relatively small number of genes. It does this by splitting antibodies into four regions - the variable region (V), the diversity region (D), the joining region (J), and the constant region (C). Each of these regions has multiple genes associated with it, separated by Recombination Signal Sequences (RSSs) and spacers. Heavy-chain antibodies use all four types of regions, while light-chain antibodies use just the V, J, and C regions. The constant regions are (surprise!) constant within an antibody class.
So, when B-cells mature, they pick a single gene from each of the V, D, and J regions, assemble them together, and join them to the constant region of the antibody. The VDJ regions specify the affinity of the gene towards an antigen (which is why it needs so much diversity), while the C region specifies the attachment to the cell (which is why it remains constant).
However, when VDJ regions are recombined, an interesting thing happens - a series of non-templated (N) and/or Palindromic (P) elements are inserted between these regions. Current efforts so far (as of the 2006 paper - I haven't kept up since then) have classified these as "random" insertions. What can the Creation model offer?
So, in V(D)J recombination, you have
So, in the metaprogramming model, what is the role of the rearranging system? It not only helps the parts of the code go into their correct places and adds in the non-redundant parts, but also manages the interactions of the parts, so that they recombine properly.
Therefore, if V(D)J recombination is acting as a metaprogramming system, then the probable reason for the addition of N and P elements is to manage the interaction of the parts. This lets the V, D, and J components evolve more freely without having to worry about how they will interact with the other parts of the recombination system. The recombination system worries about how the parts will interact.
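As a toy illustration of that picture (the segment sequences and junction rules here are invented for the example - this is not real immunoglobulin data), the following sketch assembles an antibody "repertoire" from a handful of V, D, and J segments plus non-templated (N) junction insertions:

```python
import random

# Toy V(D)J-style recombination: pick one segment of each type and
# join them with short random N insertions at the two junctions.
V_SEGMENTS = ["CAGGTG", "GAGGTC", "CAGCTG"]
D_SEGMENTS = ["GGTAGC", "GATTAC"]
J_SEGMENTS = ["TGGGGC", "TACTTC"]

def recombine(rng):
    v = rng.choice(V_SEGMENTS)
    d = rng.choice(D_SEGMENTS)
    j = rng.choice(J_SEGMENTS)
    # Non-templated insertions: 0-4 random bases at each junction.
    n1 = "".join(rng.choice("ACGT") for _ in range(rng.randint(0, 4)))
    n2 = "".join(rng.choice("ACGT") for _ in range(rng.randint(0, 4)))
    return v + n1 + d + n2 + j

rng = random.Random(0)
library = {recombine(rng) for _ in range(1000)}
print(len(library))  # many distinct sequences from only 7 segments
```

In the metaprogramming reading, the interesting part is the junction logic: a "managed" version of recombine would choose n1 and n2 with knowledge of the flanking segments (for instance, to guarantee a required residue at a junction) rather than uniformly at random.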
So, is there any evidence that this is what is going on? Actually there is. In certain mouse antibodies, arginine is required at position 96 in order for the antibody to have proper affinity. Interestingly, this was always generated during recombination in the cases where it was required, even if it wasn't coded for by either of the joined segments! It appears as though the V(D)J recombination system knows that the arginine is required for affinity, and therefore is able to generate it when necessary.
Now, there are two possible pieces of counter-evidence which I am aware of:
However, the first one could simply be because (a) the recombination system is directional but non-deterministic (i.e., it biases outcomes toward ones that are probably workable, but doesn't limit the outcome to a single possibility), (b) there are additional elements at play, or (c) both (a) and (b).
The second one could be the result of a non-deterministic system - that it only biases good results but doesn't guarantee them.
Obviously, this needs experimenting before it is taken as fact, but I think the evidence currently points in this direction.
If the V(D)J recombination system is actually a metaprogramming system, there are some other possibilities worth looking into:
The V(D)J Recombination system is a fairly standard metaprogramming system. However, in Computer Science we have another type of metaprogramming facility, called an "enterprise" metaprogram. In these, the specifications are actually specifications for multiple different subsystems. That is, a single template is run through multiple metaprogramming systems (one for each subsystem), and it can generate a unified, interacting system.
So, in biology, we would be looking for a system that recombines one way in one tissue and another way in another tissue, such that variations in those genes would cause the two tissue types to change in coordination with each other. Alternatively - and perhaps more likely - instead of a recombination system we might find a mechanism of alternative splicing, in which one gene is spliced in different ways depending on the tissue, but spliced such that evolutionary changes within the gene would cause coordinated changes in the protein products of multiple tissues.
NOTE - there are several claims here that are unreferenced. If you are interested in them, mention it in the comments, and I will try to look it up for you. As I said, it's been two years since I did this research, and it's been pretty much sitting on a shelf since then, so it may take me a few days to find.
I found this awfully funny.
The schedule to this year's Creation Biology Study Group meeting was just posted. Register by July 11th to get the early bird discount!
Note that both conferences are in the same location on the same week. ICC is on the first half of the week, and BSG is on the second half.