
Topic: Something new - biology / genomics / genetics / molecular bio questions answered (Read 1760 times)

newbie
Activity: 37
Merit: 0
Is it really true that suspended, "colloidal/ionic" silver is antimicrobial? And that copper is also antimicrobial in its normal metallic form, as are atomised citrus oils? I don't know where I read these things or why I remember them.

Yes for silver and copper. To what extent, I do not know (that is, I do not know what dose to use and how to apply it - sprinkle it on a surface?).

Notably, copper is also used on intrauterine contraceptive devices. These are designed with a specific surface area, to release enough copper cations to kill (or at least inhibit) sperm and thereby prevent fertilization.

Citrus oils - it seems so. Here is a link to an abstract about it:
http://www.ncbi.nlm.nih.gov/pubmed/23381618
Though the concentrations look a bit high compared to, for example, antibiotics.

As for taking colloidal silver by ingestion, I would say no, and I do not use it. You can turn blue from high doses. :D Ok, seriously, that's true - the condition is called argyria.
And in the end, all of this would sound rather different if someone tried to sell colloidal lead and claimed it was beneficial.

--
The answers are free. However, I accept donations, since grad students are not paid so well...
16a1YmEJwR3vZXdKAq65QANMYBdzTGCgiE
full member
Activity: 154
Merit: 100
I wonder if potassium chloride is a better antimicrobial than sodium chloride, since a significant portion gives off alpha particles. I once got a Geiger counter and a bunch of the stuff, and sure enough, it was radioactive. Later there was a dead spider found in the salt.

I had to check it out. :) It works! I got elevated counts, nothing spectacular though. The culprit is potassium-40, a long-lived radioactive nuclide that makes up just 0.012% of natural potassium (and it decays mostly by beta emission, not alpha). But I doubt the radiation is anywhere near high enough to kill bacteria - those guys are very radiation resistant.
Also, rubidium chloride, which used to be commonly used in molecular biology, is even more radioactive.


Is it really true that suspended, "colloidal/ionic" silver is antimicrobial? And that copper is also antimicrobial in its normal metallic form, as are atomised citrus oils? I don't know where I read these things or why I remember them.
newbie
Activity: 37
Merit: 0
I wonder if potassium chloride is a better antimicrobial than sodium chloride, since a significant portion gives off alpha particles. I once got a Geiger counter and a bunch of the stuff, and sure enough, it was radioactive. Later there was a dead spider found in the salt.

I had to check it out. :) It works! I got elevated counts, nothing spectacular though. The culprit is potassium-40, a long-lived radioactive nuclide that makes up just 0.012% of natural potassium (and it decays mostly by beta emission, not alpha). But I doubt the radiation is anywhere near high enough to kill bacteria - those guys are very radiation resistant.
Also, rubidium chloride, which used to be commonly used in molecular biology, is even more radioactive.
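
Out of curiosity, the expected count rate is easy to estimate. A back-of-the-envelope sketch (the constants are textbook values quoted from memory, so treat them as approximate):

import math

AVOGADRO = 6.022e23
HALF_LIFE_40K_S = 1.248e9 * 365.25 * 24 * 3600  # 40K half-life in seconds
ABUNDANCE_40K = 1.17e-4                         # fraction of natural K that is 40K
MOLAR_MASS_KCL = 74.55                          # g/mol, one K atom per formula unit

def kcl_activity_bq(grams=1.0):
    k_atoms = grams / MOLAR_MASS_KCL * AVOGADRO     # potassium atoms in the sample
    n_40k = k_atoms * ABUNDANCE_40K                 # the radioactive ones, N
    decay_constant = math.log(2) / HALF_LIFE_40K_S  # lambda = ln(2) / t_half
    return decay_constant * n_40k                   # activity A = lambda * N

print(f"{kcl_activity_bq():.1f} Bq per gram of KCl")  # ~16.6 decays per second

So a bag of KCl clicks away at roughly 16-17 decays per second per gram - easily detectable up close with a Geiger counter, yet a trivial dose, which fits the "elevated counts, nothing spectacular" observation.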
hero member
Activity: 728
Merit: 500
I wonder if potassium chloride is a better antimicrobial than sodium chloride, since a significant portion gives off alpha particles. I once got a Geiger counter and a bunch of the stuff, and sure enough, it was radioactive. Later there was a dead spider found in the salt.
newbie
Activity: 37
Merit: 0
We know that ionic salts, in sufficiently high concentrations, can inhibit microbial growth.

Different salts affect microbial growth differently. Some of the factors that can change the impact include: toxicity, acidity, concentration, etc. What are some of the other factors?

* toxicity of the salt itself - salts of heavy metals (copper, silver, gold, cadmium) tend to kill or inhibit microbes; there are also salts whose anions are toxic, like azides or cyanides
* concentration of the salt - a high enough concentration also puts the cell under osmotic stress
* pH of the solution - that is, how acidic or basic the solution is after the salt dissolves. This mostly just reinforces the main toxicity of the salt by imposing an extra burden on the microbial cell, which is now outside its optimal pH range. It may also help uptake of the salt.
* specificity of the toxic action - for example, some salts just have a broad toxic action, like blocking the active sites of many enzymes, but with low affinity, so they bind and unbind in a form of competitive inhibition. Something specific to a particular enzyme, on the other hand, usually has high affinity and in some cases forms a covalent bond; then the inhibition is permanent and the enzyme is dead. Cyanides, for example, inhibit cytochrome c oxidase, an enzyme in the electron transport chain of mitochondria - though that applies to eukaryotic microbes like amoebae, since those have mitochondria (you probably meant microbes = bacteria). I guess you could fold this into the toxicity point, though.
* whether the exposure is acute, or milder but prolonged - it is probably better to use a high dose outright rather than increase it gradually, because then the microbes have no time to switch on compensatory mechanisms (express genes that regulate salt uptake)
* temperature - this may pose an additional burden to the bacterial cell, or help uptake (I think)
* presence of other salts or antimicrobial compounds - an additional burden on the cell
* lack of a food source - no energy for compensatory mechanisms, like actively transporting the salts back out of the cell

I can't think of any more, but I bet that with more details about the specific situation one could find more factors affecting the antimicrobial action of salts.

newbie
Activity: 37
Merit: 0

This is amazing, thank you. What would be needed for this process to become a viable automated computing method? If it becomes easily implementable, then won't that spell the end of certain types of cryptography?

P.S. I'm just as broke as you, so I can't send you any tips.

Unless someone finds a way to make a somewhat programmable computer out of it, I doubt it will have much impact. And such an experiment takes time, plus a few days of work; as long as the steps are enzymatic reactions, it is going to stay that way. There is no nice scheme yet where you just mix everything together, wait a bit, and read out the answer by sequencing. I have seen some papers on DNA cryptography - some interesting ideas, but nothing too practical, nothing that beats computers yet.
I have heard someone is experimenting with DNA for breaking some form of cryptography, but I have not had time to track it down and get the details. It may be bogus, or just an unsuccessful attempt. If something like that works, it will come out big.

But I think we are tantalizingly close to storing information in DNA. We are already doing it in our experiments. It just costs a bit, and you need a company for the readout.

For example, I can order a 500-base-pair DNA fragment of arbitrary sequence (1000 bits, at 2 bits per base). It costs $99. So I order it, take the tiny drop and put it on a piece of paper. I let it dry and encircle the place where the drop was. I send it to you in a letter; you cut that piece out, soak it in water and send the water to a sequencing company. It costs about $10 to read out, and you get the results in a few days. I guess that's a nice example of steganography.
We can also synthesize an oligo library: 22,000 fragments of 36 nucleotides each. It costs much more, but it holds 1,584,000 bits = 198,000 bytes. Well, close enough to a 5.25" floppy disk (320 kB). :) Reading it back is pretty costly too.
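
For fun, here is the bookkeeping as code - a toy 2-bits-per-base mapping of my own invention, for illustration only (not an encoding we actually use in the lab):

BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def bytes_to_dna(data: bytes) -> str:
    # 4 bases per byte, most significant bits first
    return "".join(BASE_FOR_BITS[(byte >> shift) & 0b11]
                   for byte in data for shift in (6, 4, 2, 0))

def dna_to_bytes(seq: str) -> bytes:
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

secret = b"meet at dawn"
fragment = bytes_to_dna(secret)
print(len(fragment), fragment)        # 48 bases for 12 bytes
assert dna_to_bytes(fragment) == secret

At 2 bits per base, the $99 500-mer above carries exactly 125 bytes - plenty for a short secret message.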
legendary
Activity: 1246
Merit: 1077
We know that ionic salts, in sufficiently high concentrations, can inhibit microbial growth.

Different salts affect microbial growth differently. Some of the factors that can change the impact include: toxicity, acidity, concentration, etc. What are some of the other factors?
full member
Activity: 154
Merit: 100
DNA computing.

Since DNA and the enzymes that work on it are so well studied, they can be used for many interesting manipulations.

DNA does not offer anything better for computing (it is definitely harder to program), except for the sheer numbers. If I get a typical amount - 0.5 ml of a 100 micromolar solution of an oligonucleotide from a commercial synthesis company, for $10 - that's about:

500 microliters * 100 pmol/microliter = 50,000 pmol. One mole is 6.022*10^23 molecules, so: 50,000 pmol * 10^-12 mol/pmol * 6.022*10^23/mol ≈ 3 * 10^16 molecules.

The first implementation of DNA computing was for the travelling salesman problem, which is NP-hard. As n, the number of cities, grows, the number of possible routes explodes: for just 7 cities there are already 360 possible round trips, and past that it grows faster and faster.
Leonard Adleman used DNA fragments for all the possible edges of the graph, with lengths proportional to the distances. The nodes were the ends of those DNA molecules. DNA is a double helix - it has two strands - and at an end, one strand can be shorter than the other, like this:

ACTGA
TGACTACGAGA

this overhang has the interesting property of forming a relatively stable interaction with a fragment carrying a compatible end (lowercase marks the second fragment; note that the two backbone nicks are not yet sealed):

ACTGAtgctctcgacgc
TGACTACGAGAgctgcg


And this interaction can later be made permanent by a reaction called ligation, which joins the two into one DNA molecule:

ACTGATGCTCTCGACGC
TGACTACGAGAGCTGCG

Adleman took DNA fragments representing all the edges of the graph for the 7 cities and mixed them in a tube, where they all started linking up via compatible ends (which represent cities). Then he ligated those fragments, and he got DNA molecules representing all the possible routes between the cities.

Of course, some of them were much longer, some much shorter, so he had to select the ones containing all 7 cities.
Now, I mentioned that the link between DNA fragments represents a city. DNA also has the property that if we separate the strands of the double helix, each strand will look for a compatible (complementary) sequence. If we attach small fragments of single-stranded DNA complementary to a given city to small polystyrene beads, we can select all the DNA fragments having this city in their path (they will stick to the DNA attached to the beads).
So if we do a step-by-step selection for all seven cities, we are left with only the fragments that have all 7 cities in their path. Now we simply need the shortest one, but that's easy, because there are size-selection methods.
So Adleman did all of that in a couple of days, sequenced the shortest strand, and voila - he had the optimal solution to the problem.

Of course it is not very practical, but it was the first proof-of-principle demonstration. (Strictly speaking, Adleman's 1994 experiment found a directed Hamiltonian path through 7 nodes rather than solving a weighted travelling-salesman instance, but the principle is the same.)

This is amazing, thank you. What would be needed for this process to become a viable automated computing method? If it becomes easily implementable, then won't that spell the end of certain types of cryptography?

P.S. I'm just as broke as you, so I can't send you any tips.
newbie
Activity: 37
Merit: 0
DNA computing.

Since DNA and the enzymes that work on it are so well studied, they can be used for many interesting manipulations.

DNA does not offer anything better for computing (it is definitely harder to program), except for the sheer numbers. If I get a typical amount - 0.5 ml of a 100 micromolar solution of an oligonucleotide from a commercial synthesis company, for $10 - that's about:

500 microliters * 100 pmol/microliter = 50,000 pmol. One mole is 6.022*10^23 molecules, so: 50,000 pmol * 10^-12 mol/pmol * 6.022*10^23/mol ≈ 3 * 10^16 molecules.
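
Or, as code, for anyone who wants to check the arithmetic:

AVOGADRO = 6.022e23          # molecules per mole
volume_ul = 500              # 0.5 ml
conc_pmol_per_ul = 100       # 100 micromolar = 100 pmol per microliter
picomoles = volume_ul * conc_pmol_per_ul      # 50,000 pmol
molecules = picomoles * 1e-12 * AVOGADRO      # pmol -> mol -> molecule count
print(f"{molecules:.2e} molecules")           # ~3.01e16, for ten dollars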

The first implementation of DNA computing was for the travelling salesman problem, which is NP-hard. As n, the number of cities, grows, the number of possible routes explodes: for just 7 cities there are already 360 possible round trips, and past that it grows faster and faster.
Leonard Adleman used DNA fragments for all the possible edges of the graph, with lengths proportional to the distances. The nodes were the ends of those DNA molecules. DNA is a double helix - it has two strands - and at an end, one strand can be shorter than the other, like this:

ACTGA
TGACTACGAGA

this overhang has the interesting property of forming a relatively stable interaction with a fragment carrying a compatible end (lowercase marks the second fragment; note that the two backbone nicks are not yet sealed):

ACTGAtgctctcgacgc
TGACTACGAGAgctgcg


And this interaction can later be made permanent by a reaction called ligation, which joins the two into one DNA molecule:

ACTGATGCTCTCGACGC
TGACTACGAGAGCTGCG

Adleman took DNA fragments representing all the edges of the graph for the 7 cities and mixed them in a tube, where they all started linking up via compatible ends (which represent cities). Then he ligated those fragments, and he got DNA molecules representing all the possible routes between the cities.

Of course, some of them were much longer, some much shorter, so he had to select the ones containing all 7 cities.
Now, I mentioned that the link between DNA fragments represents a city. DNA also has the property that if we separate the strands of the double helix, each strand will look for a compatible (complementary) sequence. If we attach small fragments of single-stranded DNA complementary to a given city to small polystyrene beads, we can select all the DNA fragments having this city in their path (they will stick to the DNA attached to the beads).
So if we do a step-by-step selection for all seven cities, we are left with only the fragments that have all 7 cities in their path. Now we simply need the shortest one, but that's easy, because there are size-selection methods.
So Adleman did all of that in a couple of days, sequenced the shortest strand, and voila - he had the optimal solution to the problem.

Of course it is not very practical, but it was the first proof-of-principle demonstration. (Strictly speaking, Adleman's 1994 experiment found a directed Hamiltonian path through 7 nodes rather than solving a weighted travelling-salesman instance, but the principle is the same.)
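
The whole strategy is easy to caricature in software. Here is a sketch on a made-up 4-city graph with invented distances (the real experiment used 7 nodes and carefully designed oligos): generate masses of random edge-joinings, keep the assemblies that visit every city exactly once, and take the shortest.

import random

# hypothetical toy graph: (city, city) -> distance
EDGES = {("A", "B"): 3, ("B", "C"): 4, ("A", "C"): 9,
         ("C", "D"): 2, ("B", "D"): 7, ("A", "D"): 6}
CITIES = {"A", "B", "C", "D"}

def random_assembly():
    # mimic fragments annealing end-to-end: extend a path by random edges
    path = [random.choice(sorted(CITIES))]
    total = 0
    while len(path) < len(CITIES):
        options = [(b, d) for (a, b), d in EDGES.items() if a == path[-1]]
        options += [(a, d) for (a, b), d in EDGES.items() if b == path[-1]]
        city, dist = random.choice(options)
        path.append(city)
        total += dist
    return total, path

# "ligate" many random assemblies, "bead-select" those visiting every city
# exactly once, then "size-select" the shortest survivor
assemblies = (random_assembly() for _ in range(100_000))
tours = [(d, p) for d, p in assemblies if set(p) == CITIES]
print(min(tours))   # (9, ['A', 'B', 'C', 'D']) - the shortest path

The point of the wet version is that the tube performs an astronomical number of these "random assemblies" in parallel, for free.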
newbie
Activity: 37
Merit: 0

Haha, this is a mirror image of me. I'm sure I've said those same lines somewhere on this forum. I hope you didn't look up my old posts and aren't trolling me, because it's uplifting to know other people see it the same way.

edit: except the "die as scientists" part. I think it is more an issue of not having the time to learn stats and maths when it is acceptable to keep doing things the way they are being done. Most researchers want to be good scientists but are stuck in a situation in which that is not possible, or very, very difficult to accomplish (for social reasons).


Fortunately, there are a few people who also think critically and want to see change.
But I think you are right that they want to be good scientists but have to concentrate on their current research and forget about everything else. I grew up being interested very broadly in different subjects - computer science, chemistry, biology, physics - and I was stunned to realize at university that people specialize so much.

But I have to say, I was always afraid of mathematics. I was good at it in high school and early at university, but the level of the hardcore mathematics department at my uni was unattainable for me, so I grew up with respect for it. That was before I was 22. Now my interest in maths is reborn and I am getting into more and more advanced material. So maybe a person needs to get a bit older to really get it?

hero member
Activity: 728
Merit: 500
How many of the signalling molecules and genes whose expression we study are actually cycling through high and low activity states (e.g. bistable) throughout the day, without any experimental manipulation? I ask because of all the n=3 researchers who hang their hats on large effect sizes, which seems to me to implicitly assume that the activity/expression follows a normal distribution.

I should add a disclaimer: I don't think it would even be possible not to do n=3 experiments and survive as a molecular biologist these days.


I feel that I am only partially following your question, so help me out if I don't nail it.
To the first part - how many genes are cycling through low and high levels of expression over a period of time without any manipulation: I believe it is many. I have read articles demonstrating the existence of such oscillating networks of genes. But people in biology don't care much, since most experiments take a readout from a whole population of cells. The result is that you get an average between those states, exactly as you are saying.

This is what people have been satisfied with so far, because there were no methods working on single cells. Now the situation has changed: there are single-cell qPCR and sequencing methods. They are very experimental and immature at the moment, so not widely adopted at all. I was collaborating with people doing single-cell qPCR on embryonic stem cells, and they observed some of those genes switching between low and high states; as far as I remember, in the low state the cell was more susceptible to differentiation cues. So I have at least one first-hand account that something like that happens.

The problem is that not many researchers are thinking about this. The field has big inertia; people will keep doing what they have been doing for years. They were limited to assaying populations of cells, so they keep doing it the old way, even though single-cell methods are available now.

The other problem is that biologists are limited in their grasp of mathematical concepts. I think most do not actually realise that the value they measure, the average, may in fact be the product of a distribution that is not normal but bimodal or trimodal. Well, it is worse than that: no one except the splicing guys cares about alternative splicing isoforms, and those have been known about for quite a while...

Why is it like that? I know some guys just like doing experiments, even though some of them do not make sense. A complete waste of money and time.

There is this systems biology movement now, which is supposed to grasp the cell as a whole, given all the data at hand. I think this is one of the problems they are facing: some of the expression data may be unusable because of those oscillations. They will have to collect data in a systematic way, preferably on single cells, and then work out what oscillates. But that is a huge amount of effort.

As for doing just the customary 3 replicates... This is why I really don't bother reading articles anymore; I have no guarantee that they are not a statistically insignificant pile of bs. Researchers simply carry over what worked for measuring seed or stem size to the genomic realm, and this inevitably fails.

I am nearing my end-cycle as a graduate student, and I am disillusioned and cynical about the situation. I am just trying to look for invariants - things which cannot really depend on how sloppy the researcher is. But many people are in the bliss of ignorance. I just hope I will live long enough to see them die out as scientists. Without statistics and mathematics this field is a cargo-cult science.


Haha, this is a mirror image of me. I'm sure I've said those same lines somewhere on this forum. I hope you didn't look up my old posts and aren't trolling me, because it's uplifting to know other people see it the same way.

edit: except the "die as scientists" part. I think it is more an issue of not having the time to learn stats and maths when it is acceptable to keep doing things the way they are being done. Most researchers want to be good scientists but are stuck in a situation in which that is not possible, or very, very difficult to accomplish (for social reasons).
newbie
Activity: 37
Merit: 0
Ok, a bonus answer.
At one of the lectures at my uni, an older professor explained to us why we die.
He started with an example: imagine that you have a bar with 100 beer glasses. Glasses, as such, can last almost indefinitely: people use them, they get cleaned and reused. All looks great, but over any given period of time (a month, a year, whatever) there is a certain small probability that a glass will be dropped and shatter. You can make this probability small (thick-walled glasses), so the 100 glasses will last a while, but eventually there will be nothing left. This is why they need to be replenished, and this is where reproduction comes into play. Let's say we buy new glasses to replenish the originals whenever the count drops to 50. Each new batch will be a bit different, so you can track what remains from each batch.
I bet you can see how the beer-glass population would look after a while.

The point is, even if we were immortal, then after some time, be it 1,000 years or 10,000, we would most likely be dead from some random cause.
The world is a dangerous place, and it was even more so in the past. This is why it is better to invest in reproduction and the upbringing of offspring, so there is evolutionary pressure to have offspring. This is why s*x is so pleasant and why parents are so obsessed with their children (I bet you have seen some FB pages of young moms).

Now, why can't a life of continuous reproduction just go on indefinitely in each individual?
If there is a period of time, say 50 years, after which statistically 90% of individuals will be dead from random causes (animals, disease, etc.), then there is really no pressure to live longer. There is no selection for longer life. So that is the age when all the imperfections can show up, like heart disease, cancer and so on.
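
The professor's argument is easy to put into numbers. A sketch (the 4.5% yearly breakage rate is an illustrative value, picked so that roughly 90% of the glasses are gone by year 50):

import random

def survivors(n=100, p_break_per_year=0.045, years=50):
    alive = n
    for _ in range(years):
        alive -= sum(1 for _ in range(alive)
                     if random.random() < p_break_per_year)
    return alive

# (1 - 0.045)**50 is about 0.10, so ~90% of glasses are gone by year 50
print(survivors())   # typically around 10 of the original 100

Any trait that only matters after that point is carried by almost no one still alive, so selection cannot act on it - which is the professor's whole point.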
newbie
Activity: 37
Merit: 0
How many of the signalling molecules and genes whose expression we study are actually cycling through high and low activity states (e.g. bistable) throughout the day, without any experimental manipulation? I ask because of all the n=3 researchers who hang their hats on large effect sizes, which seems to me to implicitly assume that the activity/expression follows a normal distribution.

I should add a disclaimer: I don't think it would even be possible not to do n=3 experiments and survive as a molecular biologist these days.


I feel that I am only partially following your question, so help me out if I don't nail it.
To the first part - how many genes are cycling through low and high levels of expression over a period of time without any manipulation: I believe it is many. I have read articles demonstrating the existence of such oscillating networks of genes. But people in biology don't care much, since most experiments take a readout from a whole population of cells. The result is that you get an average between those states, exactly as you are saying.

This is what people have been satisfied with so far, because there were no methods working on single cells. Now the situation has changed: there are single-cell qPCR and sequencing methods. They are very experimental and immature at the moment, so not widely adopted at all. I was collaborating with people doing single-cell qPCR on embryonic stem cells, and they observed some of those genes switching between low and high states; as far as I remember, in the low state the cell was more susceptible to differentiation cues. So I have at least one first-hand account that something like that happens.

The problem is that not many researchers are thinking about this. The field has big inertia; people will keep doing what they have been doing for years. They were limited to assaying populations of cells, so they keep doing it the old way, even though single-cell methods are available now.

The other problem is that biologists are limited in their grasp of mathematical concepts. I think most do not actually realise that the value they measure, the average, may in fact be the product of a distribution that is not normal but bimodal or trimodal. Well, it is worse than that: no one except the splicing guys cares about alternative splicing isoforms, and those have been known about for quite a while...
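
A quick simulation shows how badly the bulk average can mislead. The numbers are invented (a bistable gene sitting at ~10 or ~100 expression units), but the effect is the point:

import random

def single_cell_expression():
    # bistable gene: each cell is either in a low state or a high state
    in_low_state = random.random() < 0.5
    return random.gauss(10, 2) if in_low_state else random.gauss(100, 10)

cells = [single_cell_expression() for _ in range(10_000)]
average = sum(cells) / len(cells)
near_avg = sum(1 for c in cells if 0.8 * average < c < 1.2 * average)
print(f"bulk average: {average:.0f}")                 # ~55
print(f"cells actually near that value: {near_avg}")  # essentially zero

The population readout faithfully reports a level that almost no individual cell ever has.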

Why is it like that? I know some guys just like doing experiments, even though some of them do not make sense. A complete waste of money and time.

There is this systems biology movement now, which is supposed to grasp the cell as a whole, given all the data at hand. I think this is one of the problems they are facing: some of the expression data may be unusable because of those oscillations. They will have to collect data in a systematic way, preferably on single cells, and then work out what oscillates. But that is a huge amount of effort.

As for doing just the customary 3 replicates... This is why I really don't bother reading articles anymore; I have no guarantee that they are not a statistically insignificant pile of bs. Researchers simply carry over what worked for measuring seed or stem size to the genomic realm, and this inevitably fails.

I am nearing my end-cycle as a graduate student, and I am disillusioned and cynical about the situation. I am just trying to look for invariants - things which cannot really depend on how sloppy the researcher is. But many people are in the bliss of ignorance. I just hope I will live long enough to see them die out as scientists. Without statistics and mathematics this field is a cargo-cult science.
newbie
Activity: 37
Merit: 0
Thanks! I sent you a small donation. (I hope it will be enough for you to buy a house one day.)

Thank you, ribuck! This is much appreciated.

Please ask questions, guys. No question is too small or too silly. We may stumble on something interesting.

Also, I have decided to type in an editor first, so hopefully the next answers will be more concise.
hero member
Activity: 728
Merit: 500
How many of the signalling molecules and genes whose expression we study are actually cycling through high and low activity states (e.g. bistable) throughout the day, without any experimental manipulation? I ask because of all the n=3 researchers who hang their hats on large effect sizes, which seems to me to implicitly assume that the activity/expression follows a normal distribution.

I should add a disclaimer: I don't think it would even be possible not to do n=3 experiments and survive as a molecular biologist these days.
newbie
Activity: 37
Merit: 0
Thanks for that!

Although there is no mutation in Conway's Life, there is mutation when memes reproduce, so we need a definition that excludes memes. Also, given the right environment, prions could be said to multiply.

Defining life as cells works, but it's hard to define a cell without introducing the same difficulties with metabolism as when defining life.

I was thinking a bit about it. At first I was opposed to excluding memes and prions. But frankly, they are boring, because most of the information for their reproduction is in the Universe. Prions reproduce if they are supplied with a Universe that has our set of rules plus a supply of non-misfolded prion protein: most of the information about the prion is in the building block used for its reproduction. I think it is appropriate to call them replicating systems, but life needs to be something more.

And then I thought that the clause "assembling itself from simpler blocks based on information contained within itself" is the good one. With this clause prions are excluded, and memes are excluded too, because most of the information for their replication is either in the building block (prion) or in the Universe (meme - the Universe is us, replicating the meme).

So then cellular life is life (it replicates and builds itself from smaller blocks); viruses are life (if their Universe has cells); a von Neumann machine is life (in its own universe it builds itself from cells on the grid, based on the program on its tape); DNA in a PCR reaction is life (it replicates itself based on the information within itself and does so from simple building blocks; its universe has to have our quantum mechanical rules and a functioning Taq polymerase).

Why do I think describing the Universe is necessary in the definition? Well, even cellular life, which is clearly life, needs quantum mechanical laws for its molecules to function. Those laws can be encoded as information, and without this information - those rules - even the exact state of all the atoms in the cell is useless. So some information is within the replicating mechanism, and some information is in the Universe. It also helps to clarify what we are talking about. Plus it kind of shuts the mouths of people saying: what, DNA in PCR is life??? Yeah, for me, this behavior is life. It is for you to decide whether the universe so defined is interesting or not.

Clearly, these are my own ideas. The field generally does not think about this.

I am having lots of fun answering these questions so far.
newbie
Activity: 37
Merit: 0
What's the current thinking on the rigorous definition of "life"?

This is not really thought about or discussed. The definitions I have seen were simply lists of features, so nothing rigorous.

I guess the current consensus would be that cellular life is life, while viruses and such, when outside the cell, are as dead as a rock. In short, you need a metabolism to be alive, and especially the ability to reproduce yourself. In fact, reproduction is central to the question.

But I thought about this a while ago. I think that you cannot think about life without an environment - which is space plus whatever you need for reproduction. Why I think this is important - just give me a moment. Even for cells it gets a bit hairy, because there are only a few bacteria or archaea which can survive and replicate on simple chemicals alone: CO2, some nitrogen compounds, phosphates and other inorganics, plus light energy (or chemical energy from inorganics). Animal cells require a mixture of amino acids, glucose - quite a long list of ingredients - in order to divide. The same goes for animals or plants, even small or unicellular ones. They cannot really do anything in pure water, for example.
So clearly even "alive" cells cannot do anything without an environment. So let's extend this and define life as something with the ability to reproduce in a given environment. Since I consider some abstract places to be environments, let's call it a universe, and let's take it to be both the things required for reproduction and a set of rules.
So one universe where life exists is clearly ours. Another is Conway's Game of Life: there are patterns which have the ability to reproduce - different universe, different set of rules. If you define a universe as a tube with dNTPs and Taq polymerase where the temperature is cycling, then a single DNA molecule is alive, because under those conditions it can reproduce (the PCR reaction).
There is also the von Neumann universal constructor, an automaton which can reproduce itself given a proper program. It can even carry mutations in its program over to its offspring, just like cellular life does. But it exists in a 2D-grid universe where each cell on the grid can be in one of 29 states. Quite a different universe from ours, right?
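
For concreteness, the complete set of physical laws of Conway's universe fits in a few lines of code (a minimal sketch, representing live cells as a set of (x, y) coordinates):

from itertools import product

def step(alive):
    # one tick of Conway's Life: birth on 3 neighbours, survival on 2 or 3
    counts = {}
    for x, y in alive:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                counts[(x + dx, y + dy)] = counts.get((x + dx, y + dy), 0) + 1
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))   # the same 5-cell shape, shifted by (1, 1)

Self-copying patterns within these rules would be that universe's "life".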

So why not say all of those things are alive, if you define your universe accordingly? The only caveat is that some universes (like ours) are more interesting than others (like Conway's - sorry, I was excited that the pattern replicates, but it is so simple it is not interesting, plus it does not have the mutation carry-over ability).
Then viruses are also alive if you define the universe as one where cells are taken for granted, while they are dead in a universe with no cellular life.
A computer virus is more boring, because it depends on the highly sophisticated copying ability provided by the hardware. In that case the universe is a wonder machine which does the replication; the computer virus just asks for it. But even then I would say "it is alive!"

Ok, enough for now, but I may write more about this in the future, because I really like the concept. If there are any questions, please ask. It is hard to write this clearly enough to be understood in full.
newbie
Activity: 37
Merit: 0
In a W. Gibson anthology, there is a story about a lethal virus put into a hacked DNA synthesizer to kill an unaware researcher... I guess the secret services switched to inorganics (maybe safer).
Yes, it's unbelievably cool. I didn't complete my degree, but the first two years were exciting... maybe we'll have biocryptography soon, with enzymes as public keys.

You should read Ken Alibek's book with the catchy title "Biohazard". It is about the Russian bioweapons program. He apparently worked with a KGB guy who constructed various contraptions designed to release powdered microorganisms in the vicinity of a targeted person. The book is a little alarmist in tone overall; I can only say that some of the facts stated in it are true (I have another source confirming them), while the validity of the others I cannot verify.

Biocryptography is an interesting idea I had not thought about.
I have seen many interesting things in the works. Like cell-specific barcodes, which can then be used to detect which cell links to which cell (best used in the context of brain connectivity). Enzymes swapping chunks of DNA, deleting them, inverting them, etc. The problem with biological systems, however, is that we are trying to control a system with many variables, and unless we control all of them, the system behaves unpredictably. Many times a system works 90% of the time and then goes haywire once in a while. It is hard to construct a dependable system (not impossible, but it demands rigor in experiments). This is why I think all those cool experimental things have not been put to practical use yet.

I think what may be coming is DNA as a digital content storage medium. That would make sense, because it has enormous information density and in fact evolved as a storage medium. We can sequence very old pieces of DNA, for example from Neanderthals or extinct mammals (tens of thousands of years old). That means it is pretty stable.
Sequencing has become cheaper and cheaper; we can now sequence ~600 gigabases (600 billion nucleic acid bases, every base worth 2 bits) for a few grand. I think it will keep going down both in size and cost; there are some working prototypes of semiconductor-based sequencing. Synthesis lags behind in throughput, but it is still pretty usable now. And we use all of this in the lab all the time: we synthesize fragments of specified sequence (22,000 different fragments of 30-40 bases), do something with them, and in the end sequence the result. It works, here and now. There was a recent paper in which some guys encoded Shakespeare's sonnets in DNA and then sequenced them back. I thought it was a publicity stunt, since it is nothing new, but it hit the headlines.
The theoretical maximum for DNA is about 3.6x10^21 bits per gram (I once calculated it and have now forgotten how, but it works out: a nucleotide weighs ~330 g/mol, so one gram is ~1.8x10^21 nucleotides, and at 2 bits per nucleotide that is ~3.6x10^21 bits). In practice there must be redundancy, so the real figure is lower. Still pretty large, and I think you cannot go much beyond it, because this is information encoded at the molecular level.
In a nutshell: an excellent long-term storage material.
Just imagine: one day you come home, back up a chunk of the internet, and toss the tube into the -80C freezer. :)

newbie
Activity: 37
Merit: 0
Do you think Erk phosphorylation will change if someone sneezes in the next room over?

Good question! (At least that's what most of our invited speakers say when they try to buy time.)

Erk is a protein in a pathway regulating cell division. Another protein can add a phosphate group to it, making it active, and thereby regulate this activity (whether the cell divides or not).
It is a really beautiful model in which proteins in series activate each other: Ras activates Raf (Ras itself is a GTPase, not a kinase), active Raf phosphorylates MEK, and MEK phosphorylates Erk. The idea is that the signal is amplified across the layers of this signalling pathway (one Ras activates several Rafs, each of those Rafs phosphorylates several MEKs, and so on), creating a strong ON signal which goes to the nucleus and activates many genes responsible for cell division.
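
The amplification argument in toy form (the fanout of 10 per layer is an invented illustrative number; real stoichiometries vary and are hard to measure):

active = 1  # a single activated Ras at the membrane
for layer, fanout in [("Raf", 10), ("MEK", 10), ("Erk", 10)]:
    active *= fanout   # each active molecule switches on ~10 downstream ones
    print(f"active {layer}: {active}")
# one triggering event at the top ends up as ~1000 active Erk molecules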

This pathway is much liked by biologists, because it makes sense, it is well researched (hundreds of groups are sitting on it) and, most important of all, it is pretty boring. So when you write a grant about it, reviewers feel competent to assess it and confident about it.

Does that mean we really understand it in all its details? Well, I am pretty negative about that.

Even if we know most of the elements of this pathway, there are interactions with other pathways which affect it. And it is worse than that, because this is only one pathway of many, and we know much less about the others. If you likened the cell to a CPU with its registers, with the phosphorylation of Erk being one bit of a register, then we do not know how the machine state will change: various other bits will also flip, and we are unsure which. Worse, there is some variability to it. And the cat ate most of the schematics.

The whole phosphorylation field is very muddy. If you think of the rigor of mathematics, this field is exactly the opposite. So the answer to most questions is "definitely maybe".

So, funnily enough, for your question, even if a specialist in the field answered "of course not, there cannot be any interaction", I would not think they had really strong foundations for that answer. :)
member
Activity: 88
Merit: 10
In a W. Gibson anthology, there is a story about a lethal virus put into a hacked DNA synthesizer to kill an unaware researcher... I guess the secret services switched to inorganics (maybe safer).
Yes, it's unbelievably cool. I didn't complete my degree, but the first two years were exciting... maybe we'll have biocryptography soon, with enzymes as public keys.


A simple idea: since there are many intelligent people in this crowd, I bet many of you are curious about this subject.

Biotechnology and genomics are the new cool thing, and you can see many reports about them in the media. What those reports have in common is a lack of depth. Also, biological knowledge has exploded, especially recently, so it is hard to grasp what it is all about without specialised knowledge.

I want to answer your questions and keep this thread interesting and entertaining. Fun to read - that's the main goal.

I work in a lab; I am a graduate student and have been doing molecular biology research since 2002. I am an enthusiast of the topic, so my knowledge is sufficiently broad, I think.

--
The answers are free. However, I accept donations, since grad students are not paid so well...
16a1YmEJwR3vZXdKAq65QANMYBdzTGCgiE