Adventures in obsolescence: DIY CO2 system -
This is a short write-up of my DIY low-tech CO2 system for a planted aquarium. It has been running for about two years now without issue.
- 1/4” airline tubing
- 2X 2 liter bottles
- 1X 12 ounce bottle
- 4X 1/4” fittings
- 1X 1/4” gang valve
- 1X bubble counter (optional)
- 1X diffuser
science :): The importance of stupidity in scientific research -
I recently saw an old friend for the first time in many years. We had been Ph.D. students at the same time, both studying science, although in different areas. She later dropped out of graduate school, went to Harvard Law School and is now a senior lawyer for a major environmental organization. At…
This is why, if you want higher resolution, you go with black and white. The filter array is both a creative way to interpret color and sometimes a barrier when you want to maximize the content you can visualize while maintaining color (such as when viewing fluorescently tagged microscopic particles). It’d be interesting if the color of an individual pixel could be varied, as opposed to interpolating four pixels to get one color. Perhaps this could be done by varying the current passed through individual pixels to produce the spectrum of colors. Nanotechnology to the rescue!
This is a Bayer array. This is how the sensors on almost all digital cameras work. Every pixel has a color filter that only permits certain wavelengths of light to pass through to the light well below, which can’t distinguish different colors. Every 2x2 square of pixels has one red, one blue and two green pixels: this is because our eyes are more sensitive to green, and we want our photographs to resemble what the eye sees. Digital images need one red, one green and one blue value for each pixel. But every pixel on the Bayer grid has only 1/3 of that information. The other two colors need to be interpolated (clever mathematical guesswork, basically) from adjacent pixels of the correct color. This process is called demosaicing, and there are many possible approaches. Some algorithms are significantly better than others. None of them can accurately recreate all the information that a sensor that records all three colors at every pixel would record.
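To make the 2x2 layout concrete, here is a minimal Python/NumPy sketch (my own illustration, assuming the common RGGB arrangement; actual sensor layouts vary) of how a Bayer sensor records exactly one color channel per pixel:

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate an RGGB Bayer sensor: keep exactly one color channel per pixel.

    rgb: (H, W, 3) array with full color at every pixel.
    Returns an (H, W) single-channel mosaic, as the sensor would record it.
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red at even rows, even cols
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green at even rows, odd cols
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green at odd rows, even cols
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue at odd rows, odd cols
    return mosaic
```

Each 2x2 block keeps one red, two green, and one blue sample; the other two-thirds of the color information at every pixel is simply never recorded.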
The algorithms used in demosaicing are similar or identical to the ones used when resizing images (in both cases, you calculate approximate pixel values between known values). The simplest one is nearest-neighbor interpolation, where you simply copy a neighboring pixel of the correct color. More complex methods include bilinear and bicubic interpolation, which calculate values as weighted averages of many adjacent pixels. Wikipedia has some examples of how these methods work on [0, 3]x[0, 3] squares. Note that those examples are continuous, while demosaicing operates on a discrete grid of pixels. Still, I think the pictures demonstrate how some algorithms are smoother than others.
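As a rough sketch of the bilinear case (again my own illustration, continuing the RGGB assumption from the sketch above; real raw converters are far more sophisticated and edge-aware), each missing color value becomes the average of the nearest pixels that actually sampled that color, which reduces to a convolution with small averaging kernels:

```python
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(mosaic):
    """Bilinear demosaic of an RGGB Bayer mosaic: every missing color value
    is filled in as the average of the nearest samples of that color."""
    h, w = mosaic.shape
    # Masks marking where each color was actually sampled (RGGB layout).
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask
    # At sampled pixels these kernels return the value itself; elsewhere they
    # average the two or four nearest samples of that color.
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    r = convolve2d(mosaic * r_mask, k_rb, mode="same")
    g = convolve2d(mosaic * g_mask, k_g, mode="same")
    b = convolve2d(mosaic * b_mask, k_rb, mode="same")
    return np.dstack([r, g, b])
```

Nearest-neighbor would simply copy one neighboring sample instead of averaging; bicubic extends the same idea to larger weighted neighborhoods.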
The algorithms your camera or programs like Camera Raw or Lightroom use to demosaic images are more sophisticated, and slowly (almost invisibly) improving with new updates. If you have old RAW files, creating new JPEGs from them in modern software may give slightly better results than your original conversions.
Bayer sensors are easier and cheaper to make than the alternatives. They aren’t likely to go away anytime soon. Sadly, all sorts of artifacts can result from demosaicing, and really clever tricks are needed to correct them.
I Want This (But a Computer) -
A co-worker and I were talking about this the other day. We would like this technology, but as a touch screen with computational abilities. Ideally it could replace the glass sash of a biosafety hood. You would use it as an in-lab calculator, search engine, and even experimental data logger (on slow days it could resume its television functionality).
If anyone knows of transparent computer gadgetry already out there, please let me know.
this is such a cool idea. -
I think I discovered what my future house will be made out of.
Pesticide exposure in the womb may lower IQ -
Pregnant women may want to switch to organic produce – exposure to specific pesticides in the womb is linked to a reduced IQ among children.
The finding comes from three studies conducted in New York City and California’s Salinas Valley – known as “America’s salad bowl” because of its intensive cultivation of lettuce and other vegetables.
Metabolites of organophosphates were measured in the urine of pregnant women or in blood from the umbilical cord. In the Californian study, the women were divided into five groups according to these measurements. At the age of seven, the IQ scores of children of the women in the group with the highest pesticide exposures were on average seven points lower than those of children of women in the group with the lowest exposures.
The study that this article is based on seems pretty solid. Once again, diet during pregnancy appears to have dramatic repercussions on the offspring further down the line.
When we consider scientific advancement, most of the time we think of gaining new understanding that can help us create new technology and make things easier for us. We rarely think of advancement in science as making things more difficult for us. However, this can often be the case.
Take climate change for example. More and more scientific data is gathered all the time supporting the hypothesis that we as humans have been harming the environment that we live in, leaving it in a state that is toxic to us. With this knowledge, we are left with the task of further research and study into the many ways that we can solve this problem discovered through scientific inquiry. Due to the sheer magnitude of the scientific undertaking that inevitably follows the conclusive evidence of environmental damage, it is understandable that there would be many interests that would drag their feet in supporting the initial research.
Like most things in nature, we tend to follow the path of least resistance. Even in experiments of smaller scale than the titanic study of climate change, many tests can be oversimplified or ignored altogether because the results could lead to more complications down the line than if we had simply remained blind to the data. Sometimes we have to make judgment calls due to limited resources (time, funding, equipment). However, when we do so, we must never forget about the studies we failed to perform, as they may hold the answer to a problem down the line. And when we can, it is imperative that we do our duty as scientists and follow through on the hypothesis with the complex answer.
I first became aware of the importance that many non-elite scientists place on “peer-reviewed” or “refereed” journals when Howard Van Till, a theistic evolutionist, said my book The Physics of Immortality was not worth taking seriously because the ideas it presented had never appeared in refereed journals. Actually, the ideas in that book had already appeared in refereed journals. The papers and the refereed journals wherein they appeared were listed at the beginning of my book. My key predictions of the top quark mass (confirmed) and the Higgs boson mass (still unknown) even appeared in the pages of Nature, the most prestigious refereed science journal in the world. But suppose Van Till had been correct and that my ideas had never been published in refereed journals. Would he have been correct in saying that, in this case, the ideas need not be taken seriously?
To answer this question, we first need to understand what the “peer review” process is. That is, we need to understand how the process operates in theory, how it operates in practice, what it is intended to accomplish, and what it actually does accomplish in practice. Also of importance is its history. The notion that a scientific idea cannot be considered intellectually respectable until it has first appeared in a “peer” reviewed journal did not become widespread until after World War II. Copernicus’s heliocentric system, Galileo’s mechanics, Newton’s grand synthesis—these ideas never appeared first in journal articles. They appeared first in books, reviewed prior to publication only by the authors or by the authors’ friends. Even Darwin never submitted his idea of evolution driven by natural selection to a journal to be judged by “impartial” referees. Darwinism indeed first appeared in a journal, but one under the control of Darwin’s friends. And Darwin’s article was completely ignored. Instead, Darwin made his ideas known to his peers and to the world at large through a popular book: On the Origin of Species.
I shall argue that prior to the Second World War the refereeing process, even where it existed, had very little effect on the publication of novel ideas, at least in the field of physics. But in the last several decades, many outstanding scientists have complained that their best ideas—the very ideas that brought them fame—were rejected by the refereed journals. Thus, prior to the Second World War, the refereeing process worked primarily to eliminate crackpot papers. Today, the refereeing process works primarily to enforce orthodoxy. I shall offer evidence that “peer” review is not peer review: the referee is quite often not as intellectually able as the author whose work he judges. We have pygmies standing in judgment on giants. I shall offer suggestions on ways to correct this problem, which, if continued, may seriously impede, if not stop, the advance of science. —
From: Refereed Journals: Do They Insure Quality or Enforce Orthodoxy?
by Frank J. Tipler
This is an introduction from an article written by a professor of mathematical physics at Tulane University. It’s an interesting take on the issues surrounding publishing any academic work in respected journals. His focus is on science, but it is no big leap to see that it applies to any field. A big caveat is that the article appears to be laying the groundwork for publishing papers that mix religion and science in scientific journals, which in my opinion should be kept separate. However, some of the points he makes are relevant. The increasing demand for papers by universities, while pushing science forward, may not be pushing it in the proper manner. It forces scientists, more often than not, to sacrifice the scientific integrity of their work for the sole purpose of getting a paper through, and thus supporting themselves financially. It truly does stifle scientific progress when innovative papers that follow the scientific method and could lead to new discoveries are delayed.
I think that with the internet, we should be able to rework the way papers are published. Publishing and distribution can and should be made essentially free. The onus is then on the scientist to create quality papers which have obvious merit. Then it becomes a question of how to compensate a scientist properly for quality work.
One country on earth finally grants nature (food source) its own rights (click to read article) -
“Controversially, it will also enshrine the right of nature ‘to not be affected by mega-infrastructure and development projects that affect the balance of ecosystems and the local inhabitant communities’.”
That’s an interesting law. Though since it seems to be more in the spirit of a declaration than a solid law, how much of an effect it has will really depend on how much the citizens of Bolivia take it to heart. If “life, liberty, and the pursuit of happiness” can be so ingrained in the American subconscious, having a more pro-ecological culture in the text governing a nation may be enough to shift individual behavior for the better. Countries like Bolivia that have followed the path of decentralization appear to be the most likely to pass progressive laws such as this. The question is what specific laws will be passed to protect the interests of the environment and keep foreign and big-business interests in Bolivia’s resources at bay.
This diagram is more of an engineer’s view of the food web. I find that it sometimes helps to consider things as just a balance of energy and mass: it all must come from somewhere, and it must all go somewhere else. With that in mind, I retrofitted some P&ID figures to represent the exchange of energy and mass via the food web. I only went as far as secondary consumers, since adding tertiary consumers would have made the diagram too cluttered. I also assumed that the heat generated by producers such as plants is negligible compared to that produced by consumers (compare the thermal signature of a person to that of a tree); if someone can show me some autotrophs that generate tons of heat, I would appreciate the info. It is interesting to see how the overall balance works out: energy from the sun plus mass from the environment equals waste mass plus energy dissipated (in large part) as heat. However, the intermediate steps become quite complicated, and modeling them would take some mathematical juggling, the result of which could describe the mass and energy distribution in a particular biosphere.
It should be noted that I used a more in-depth model to describe the consumers than the autotrophs and decomposers, due to the greater complexity of these organisms. The models themselves greatly simplify the biological processes that account for the mass and energy transfers from organism to organism.
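As a toy illustration of that overall balance (all numbers here are my own assumptions, not measurements; the 10% figure is the textbook rule of thumb for energy transfer between trophic levels), a steady-state sketch in Python:

```python
# Toy steady-state energy balance for the food web described above.
# The 10% transfer efficiency is a rule of thumb, not measured data.
TRANSFER_EFFICIENCY = 0.10

def trophic_flows(solar_input_kj, levels=("producers", "primary consumers",
                                          "secondary consumers")):
    """Energy entering each trophic level; whatever is not passed on leaves
    the level as heat and waste, so inputs always equal outputs."""
    flows = {}
    energy_in = solar_input_kj
    for level in levels:
        passed_on = energy_in * TRANSFER_EFFICIENCY
        flows[level] = {"in": energy_in,
                        "to_next_level": passed_on,
                        "heat_and_waste": energy_in - passed_on}
        energy_in = passed_on
    return flows

# With 1000 kJ fixed by producers, only ~10 kJ reaches secondary consumers.
print(trophic_flows(1000.0))
```

At every level the input equals the output, which is exactly the mass-and-energy bookkeeping the P&ID view is meant to capture; the real complexity is in how each level partitions its losses.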