The schedule for the BSG conference has been posted. As you can see, I'll be talking about what Wolfram's complexity classes can tell us about Irreducible Complexity. I'm sure that Stephen Wolfram would cringe if he knew that's what his research was being used for. That makes me both happy and sad - happy because it was somewhat providential that I came to know about A New Kind of Science, and sad because I wish that I could repay Stephen for the good he and his company have done for my family, and I don't think that he would take too kindly to the direction in which I'm taking his work.
Two years ago I gave a talk to the BSG on how the genome acts as a metaprogramming system during VDJ recombination (see R14). I had wanted to get someone to do additional testing on this, or at least to write it up as a proper paper, before presenting it on the blog, but since I haven't had the time or resources for either, I figure I'll just go ahead and post it.
All computer programmers are inherently lazy - that's why the only thing we are good at is getting computers to do things for us. In fact, we're so lazy that if we can write a program for the computer to generate code for us, we will. Such programs - programs to generate programs - are called metaprograms. If you're interested in the computer programming aspect of metaprograms, I wrote a three-part tutorial series on it (Part 1 | Part 2 | Part 3).
But the key parts of metaprogramming are these: metaprogramming systems are used to abstract away two things that make programming difficult. And because metaprogramming systems are doing all of this work for us, they are necessarily narrow in scope.
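The idea of a program that generates a program can be sketched in a few lines. The field spec and the generated getter functions below are invented for illustration - the point is only that a short specification replaces repetitive hand-written code:

```python
# A minimal metaprogram: a program that writes another program.
spec = ["width", "height", "depth"]  # hypothetical fields we want getters for

def generate_code(fields):
    """Generate Python source code for one getter function per field."""
    lines = []
    for field in fields:
        lines.append(f"def get_{field}(obj):")
        lines.append(f"    return obj['{field}']")
        lines.append("")
    return "\n".join(lines)

source = generate_code(spec)
print(source)  # the generated program, as text

# The generated code can be executed and used directly:
namespace = {}
exec(source, namespace)
print(namespace["get_width"]({"width": 10, "height": 4, "depth": 2}))  # 10
```

Note how narrow the generator is: it can only ever produce getters of this one shape, which is exactly the trade-off described above.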
The cell can generate millions or billions of antibodies out of a relatively small number of genes. It does this by splitting antibodies into four regions - the variable region (V), the diversity region (D), the joining region (J), and the constant region (C). Each of these regions has multiple genes associated with it, separated by Recombination Signal Sequences (RSSs) and spacers. Heavy-chain antibodies use all four types of regions, while light-chain antibodies use just the V, J, and C regions. The constant regions are (surprise!) constant within an antibody class.
So, when B-cells mature, they pick a single gene from each of the V, D, and J regions, assemble them together, and join them to the constant region of the antibody. The VDJ regions specify the affinity of the antibody towards an antigen (which is why they need so much diversity), while the C region specifies the attachment to the cell (which is why it remains constant).
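The scale of that combinatorial diversity is easy to sketch. The segment counts below are illustrative placeholders, not exact figures for any species:

```python
# Back-of-the-envelope sketch of combinatorial antibody diversity.
# The gene-segment counts are hypothetical, chosen only for illustration.
n_V, n_D, n_J = 65, 27, 6  # illustrative counts of V, D, and J segments

heavy_chain_combos = n_V * n_D * n_J  # pick one segment of each type
light_chain_combos = 40 * 5           # light chains use only V and J (illustrative counts)

# Pairing a heavy chain with a light chain multiplies the diversity again,
# even before junctional (N/P) insertions add further variation.
total = heavy_chain_combos * light_chain_combos
print(heavy_chain_combos, light_chain_combos, total)
```

A few dozen segments of each type already yield millions of combinations, which is the point of the "relatively small number of genes" claim above.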
However, when the VDJ regions are recombined, an interesting thing happens - a series of non-templated (N) and/or palindromic (P) elements are inserted between these regions. Efforts so far (as of the 2006 paper - I haven't kept up since then) have classified these as "random" insertions. What can the Creation model offer?
So, in V(D)J recombination, you have:
So, in the metaprogramming model, what is the role of the rearranging system? It is not only to help the parts of the code go into their correct places and to add in the non-redundant parts, but also to manage the interactions of the parts, so that they recombine properly.
Therefore, if V(D)J recombination is acting as a metaprogramming system, then the probable reason for the addition of N and P elements is to manage the interaction of the parts. This lets the V, D, and J components evolve more freely without having to worry about how they will interact with the other parts of the recombination system. The recombination system worries about how the parts will interact.
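One way a joining system could "manage the interaction of the parts" is by inserting junction material that keeps the joined pieces compatible with each other - for instance, keeping the joined gene in reading frame. The segment sequences and the padding rule below are invented for illustration; this is an analogy for the idea, not a model of the real junctional machinery:

```python
# Sketch: a joiner that doesn't just concatenate segments, but inserts
# junction material so the parts work together. Here the hypothetical
# N-like filler keeps the running length a multiple of 3 (in frame).

def join_in_frame(v, d, j, filler="G"):
    """Join V, D, and J segments, padding each junction so the joined
    sequence stays in reading frame. Filler choice is illustrative."""
    joined = v + d
    joined += filler * (-len(joined) % 3)  # pad the V-D junction
    joined += j
    joined += filler * (-len(joined) % 3)  # pad the D-J junction
    return joined

gene = join_in_frame("GTTGCA", "GGTAC", "TGGCAA")
print(gene, len(gene))  # length is a multiple of 3
```

In this picture the V, D, and J inputs can vary freely in length and content; the joiner absorbs the burden of making them fit together.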
So, is there any evidence that this is what is going on? Actually, there is. In certain mouse antibodies, arginine is required at position 96 in order for the antibody to have proper affinity. Interestingly, in the cases where it was required, the arginine was always generated during recombination, even when it wasn't coded for by either of the joined segments! It appears as though the V(D)J recombination system knows that the arginine is required for affinity, and is therefore able to generate it when necessary.
Now, there are two possible pieces of counter-evidence which I am aware of:
However, the first one could simply be because either (a) the recombination system is directional but non-deterministic (i.e. it biases outcomes toward ones that are probably workable, but doesn't limit the outcome to a single possibility), (b) there are additional elements at play, or (c) both (a) and (b).
The second one could be the result of a non-deterministic system - that it only biases good results but doesn't guarantee them.
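The "directional but non-deterministic" idea is just weighted sampling: the system strongly favors workable outcomes without guaranteeing them. The outcome labels and weights below are invented for illustration:

```python
import random
from collections import Counter

# Sketch of a biased-but-not-deterministic process: workable joints are
# heavily favored, but non-workable ones still occur occasionally.
outcomes = ["workable_A", "workable_B", "non_workable"]
weights = [10, 8, 1]  # hypothetical bias toward workable joints

rng = random.Random(42)  # fixed seed so the sketch is repeatable
draws = rng.choices(outcomes, weights=weights, k=1000)
counts = Counter(draws)
print(counts)  # mostly workable outcomes, with occasional non-workable ones
```

Under this picture, observing some failed or suboptimal recombinations wouldn't falsify the model - it would be expected.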
Obviously, this needs experimental testing before it is taken as fact, but I think the evidence currently points in this direction.
If the V(D)J recombination system is actually a metaprogramming system, there are some other possibilities worth looking into:
The V(D)J recombination system is a fairly standard metaprogramming system. However, in computer science we have another type of metaprogramming facility, called an "enterprise" metaprogram. In these, a single specification is actually a specification for multiple different subsystems. That is, a single template is run through multiple metaprogramming systems (one for each subsystem), and it can generate a unified, interacting system.
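An enterprise metaprogram can be sketched as one specification run through several generators, one per subsystem, so that their outputs stay coordinated. The spec format and both generators below are invented for illustration:

```python
# One spec drives multiple generators - a sketch of the "enterprise"
# metaprogramming idea. Spec and generators are hypothetical examples.
spec = {"name": "user", "fields": ["id", "email"]}

def generate_sql(spec):
    """Generate a database table definition from the spec."""
    cols = ", ".join(f"{f} TEXT" for f in spec["fields"])
    return f"CREATE TABLE {spec['name']} ({cols});"

def generate_class(spec):
    """Generate an application-code class skeleton from the same spec."""
    args = ", ".join(spec["fields"])
    return f"class {spec['name'].title()}:\n    def __init__(self, {args}): ..."

# Adding a field to the single spec changes both subsystems in coordination.
print(generate_sql(spec))
print(generate_class(spec))
```

The key property is that a change to the one template propagates consistently to every subsystem - the analogue of the coordinated tissue-specific changes described below.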
So, in biology, we would be looking for a system that recombined one way in one tissue and another way in another tissue, in such a way that variations in those genes would cause the two tissue types to change in coordination with each other. Alternatively, we might be more likely to find, instead of a recombination system, a mechanism of alternative splicing: one gene would be spliced in different ways depending on the tissue, but spliced in such a way that evolutionary changes within the gene would cause coordinated changes in the protein products of multiple tissues.
NOTE - there are several claims here that are unreferenced. If you are interested in them, mention it in the comments, and I will try to look it up for you. As I said, it's been two years since I did this research, and it's been pretty much sitting on a shelf since then, so it may take me a few days to find them.
I found this awfully funny.
The schedule to this year's Creation Biology Study Group meeting was just posted. Register by July 11th to get the early bird discount!
Note that both conferences are in the same location on the same week. ICC is on the first half of the week, and BSG is on the second half.
Many creationists hold various opinions about whether or not dinosaurs lived very far past the flood. Some think they were not on the ark, some think they died out immediately afterward, some think they lived until the Middle Ages, and some think that a few of them are still hanging around today. I used to be bullish on the idea of present-day dinosaurs, but I've become slightly more skeptical. However, I still think it is an idea meriting discussion.
Today, I found in Google Reader two interesting stories about modern or semi-modern dinosaurs, one from CMI and one from AiG: