One of the most audacious space missions ever undertaken is about to come to an end. The Rosetta probe that has been tracking a comet for the past two years is going to deliberately crash itself into the 4km-wide ball of ice and dust.
European Space Agency scientists say the satellite has come to the end of its useful life and they want to get some final, ultra-close measurements. Rosetta is not expected to survive the impact with Comet 67P. But even if some of its systems remain functional, pre-loaded software on board will ensure everything is shut down on contact.
Controllers here at Esa’s operations centre in Darmstadt, Germany, will command Rosetta to change course late on Thursday.
The manoeuvre will alter its wide orbit around the duck-shaped icy wanderer and put it on a direct collision course. The probe should hit the comet’s “head” at roughly walking pace at about 11:20 GMT (12:20 BST/1320 CEST) on Friday.
The crash velocity will be low, less than a metre per second, but Rosetta was never designed to land and so various components will almost certainly be crushed as it touches down. “Just to give you an example, if the high-gain antenna is off-pointing by more than half a degree then there is no communication possible anymore,” said Esa spacecraft operations manager Sylvain Lodiot. Source: Rosetta probe heads for comet crash
In order to understand the true nature of reality, science must first recognize the importance of consciousness, says Dr. Robert Lanza, a stem-cell biologist whose work has earned him high acclaim. He also sees a greater role for consciousness in the quest for a “Theory of Everything,” larger than even physics. According to Lanza, everything that we experience—including Newtonian physics and quantum physics—is a system created by our consciousness. Even space and time are just tools used by the mind to piece together all the information of the universe.
“Reality involves your consciousness,” said Lanza in a talk on biocentrism (see below) at the Science and Nonduality Conference 2010. “It could not be there without your consciousness.”
Biocentrism is the term that Lanza gives to this concept—that the universe arises from life, not the other way around. He writes more about this radical idea in his 2010 book, Biocentrism: How Life and Consciousness Are the Keys to Understanding the True Nature of the Universe, co-written with astronomer Bob Berman.
In his book, Lanza explores famous paradoxes of quantum physics, such as the double-slit experiment and Heisenberg’s uncertainty principle. Both of these show that the behavior of particles changes when we observe them. But why should particles care if you are watching them? Lanza says they don’t; it’s just that we are creating the reality that we are observing.
Or as Niels Bohr said: “When we measure something we are forcing an undetermined, undefined world to assume an experimental value. We are not measuring the world, we are creating it.”
Lanza also questions why life exists in the first place. According to what is known as the Goldilocks principle, hundreds of parameters in the universe lie in exactly the right range for life to survive. If even one of these factors—such as the strong nuclear force or the gravitational constant—were different, life would never have arisen.
Some possible explanations for this include that God created the universe, or that among a multitude of possible universes we happen to live in the one where life is possible. Lanza, though, says that the main reason our universe can support life is that consciousness created the parameters that have made the universe so hospitable to our existence.
The old theory of the universe has yet to sufficiently answer these fundamental questions. Lanza says that in order for science to move forward, it needs to venture into new territory.
“Science hasn’t confronted the one thing that’s most familiar and most mysterious, and that is, of course, consciousness,” he said. Source: Robert Lanza: Consciousness Is the Key to Understanding Reality
Nobel season is almost upon us again, with the scientists behind the discovery of gravitational waves up for a prize – but have you ever wondered what the first-ever Nobel prize was handed out for?
Back in 1901, the King of Sweden gave Wilhelm Röntgen the inaugural Nobel Prize in Physics for the discovery of a mysterious new type of radiation.
As the Vikki Academy video above explains, at the end of the 19th century, society’s obsession with magic had been replaced by all things electric and magnetic.
One of the most intriguing new concepts at the time was the cathode ray, which formed beams of electrons in a vacuum that could be diverted by magnets.
Fascinated with these cool little beams of electricity, Röntgen decided to mess around with them further, and put a vacuum tube containing cathode rays in a sealed box and turned out the light to see what would happen. To his surprise, some kind of energy was escaping his box and lighting up a fluorescent cardboard screen behind it. Turns out, he’d found a brand new type of radiation – one that could pass through objects.
Having no idea what these rays were at the time, he called them – you guessed it – X-rays. Source: What did the first ever Nobel prize winner discover?
El Castillo in Chichen Itza served as a temple to Kukulkan. During the spring and fall equinoxes, the shadow cast by the angle of the sun on the edges of the pyramid’s nine steps, combined with the northern stairway and the stone serpent-head carvings, creates the illusion of a massive serpent descending the pyramid.
Kukulkan (“Plumed Serpent”, “Feathered Serpent”) is the name of a Maya snake deity that also serves to designate historical persons. The depiction of the feathered serpent deity is present in other cultures of Mesoamerica. Kukulkan is closely related to the god Q’uq’umatz of the K’iche’ Maya and to Quetzalcoatl of the Aztecs. Little is known of the mythology of this pre-Columbian deity.
Although heavily Mexicanised, Kukulkan has his origins among the Maya of the Classic Period, when he was known as Waxaklahun Ubah Kan, the War Serpent, and he has been identified as the Postclassic version of the Vision Serpent of Classic Maya art.
The cult of Kukulkan/Quetzalcoatl was the first Mesoamerican religion to transcend the old Classic Period linguistic and ethnic divisions. This cult facilitated communication and peaceful trade among peoples of many different social and ethnic backgrounds. Although the cult was originally centred on the ancient city of Chichén Itzá in the modern Mexican state of Yucatán, it spread as far as the Guatemalan highlands.
In Yucatán, references to the deity Kukulkan are confused by references to a named individual who bore the name of the god. Because of this, the distinction between the two has become blurred. This individual appears to have been a ruler or priest at Chichen Itza, who first appeared around the 10th century. Although Kukulkan was mentioned as a historical person by Maya writers of the 16th century, the earlier 9th-century texts at Chichen Itza never identified him as human and artistic representations depicted him as a Vision Serpent entwined around the figures of nobles. At Chichen Itza, Kukulkan is also depicted presiding over sacrifice scenes.
Sizeable temples to Kukulkan are found at archaeological sites throughout the north of the Yucatán Peninsula, such as Chichen Itza, Uxmal and Mayapan. Edited from Wiki
The announcement of the card comes 16 years after SanDisk, now owned by Western Digital, introduced its first 64MB SD™ card, and though this new card has a vastly greater storage capacity, it still manages to remain the same convenient size as a regular memory card.
In a press release about the memory card, Dinesh Bahal, the vice president of product management and content solutions at the business unit for Western Digital said, “Showcasing the most advanced imaging technologies is truly exciting for us.” The release states that the SanDisk 1TB SD card prototype “represents another significant achievement as the growth of high-resolution content and capacity-intensive applications such as virtual reality, video surveillance and 360 video are progressing at astounding rates.”
The 1TB card nearly doubles the storage capacity of the company’s impressively large 512GB SDXC card, which was released at the Photokina photography show back in 2014.
“Just a few short years ago the idea of a 1TB capacity point in an SD card seemed so futuristic — it’s amazing that we’re now at the point where it’s becoming a reality,” SanDisk wrote.
Sam Nicholson, the CEO of Stargate Studios and a member of the American Society of Cinematographers, explained in the press release that with high-capacity cards like the 1TB, not only can camera users store more images, but they can also capture them continuously without the hassle and interruption of switching cards. Since the card is still in its prototype stage, the price is unknown. However, the 512GB SDXC card sells for a whopping $800, so we’re guessing the extra space will cost some big bucks.
As the midday sun begins to sink lower and nights get noticeably longer, it can only mean the reign of summer is coming to an end for the northern half of the world. The autumn equinox arrives at 10:21 a.m. ET (2:21 p.m. UTC) on September 22, officially marking the beginning of fall in the Northern Hemisphere and the start of spring in the Southern Hemisphere.
The word “equinox” comes from Latin and means “equal night,” referring to the roughly 12-hour day and night that occurs only on the two equinox days of the year. This tidy split in our 24-hour day is linked to the reason Earth has seasons in the first place. The planet spins on an axis that is tilted 23.5 degrees with respect to its orbital plane. That means as Earth travels along its 365-day orbit, different hemispheres tilt closer to or farther from our sun’s warming rays.
An equinox is a geometrical alignment between the sun and Earth in which the sun appears positioned right above our planet’s equator. On these days, both the Northern and Southern Hemispheres experience roughly equal amounts of sunshine. It’s also only on the spring and autumn equinoxes that the sun rises due east and sets due west.
As we head toward December, the Northern Hemisphere will tilt farther away from the sun and receive its rays at a steeper angle, creating longer shadows and cooler conditions indicative of winter. Eventually, the sun will reach its lowest point in the midday sky, marking the December solstice.
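The “equal night” claim drops straight out of the standard sunrise equation, cos H = −tan(latitude) × tan(solar declination): when the declination is zero, the sunrise hour angle is 90 degrees at every latitude, giving a 12-hour day. Here is a minimal sketch of that calculation (it ignores atmospheric refraction and the sun’s angular size, which is why real equinox days actually run slightly longer than 12 hours):

```python
import math

def day_length_hours(latitude_deg, solar_declination_deg):
    """Approximate daylight duration from the standard sunrise equation,
    ignoring atmospheric refraction and the size of the sun's disc."""
    lat = math.radians(latitude_deg)
    dec = math.radians(solar_declination_deg)
    cos_h = -math.tan(lat) * math.tan(dec)
    cos_h = max(-1.0, min(1.0, cos_h))     # clamp for polar day / polar night
    sunrise_hour_angle = math.degrees(math.acos(cos_h))
    return 2 * sunrise_hour_angle / 15.0   # Earth rotates 15 degrees per hour

# On an equinox the solar declination is ~0, so every latitude gets 12 hours:
print(day_length_hours(51.5, 0.0))      # London on the equinox: 12.0
print(day_length_hours(51.5, -23.44))   # December solstice: ~7.6 hours
```

The second call previews exactly the December situation described above: the sun stays low, and the day shrinks to under eight hours at London’s latitude.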
Cultures around the world have historically celebrated the dates that represent the changing of the seasons. One notable example is an ancient Maya step pyramid known as El Castillo at Chichén Itzá in Mexico. Exactly at sunset on the spring and autumn equinoxes, sunlight hits the building’s steep staircase at just the right angle to create an eerie snake-like shape that appears to slither along its length.
Somewhere in California, Google is building a device that will usher in a new era for computing. It’s a quantum computer, the largest ever made, designed to prove once and for all that machines exploiting exotic physics can outperform the world’s top supercomputers.
And New Scientist has learned it could be ready sooner than anyone expected – perhaps even by the end of next year.
The quantum computing revolution has been a long time coming. In the 1980s, theorists realised that a computer based on quantum mechanics had the potential to vastly outperform ordinary, or classical, computers at certain tasks. But building one was another matter. Only recently has a quantum computer that can beat a classical one gone from a lab curiosity to something that could actually happen. Google wants to create the first.
The firm’s plans are secretive, and Google declined to comment for this article. But researchers contacted by New Scientist all believe it is on the cusp of a breakthrough, following presentations at conferences and private meetings.
“They are definitely the world leaders now, there is no doubt about it,” says Simon Devitt at the RIKEN Center for Emergent Matter Science in Japan. “It’s Google’s to lose. If Google’s not the group that does it, then something has gone wrong.”
We have had a glimpse of Google’s intentions. Last month, its engineers quietly published a paper detailing their plans (arxiv.org/abs/1608.00263). Their goal, audaciously named quantum supremacy, is to build the first quantum computer capable of performing a task no classical computer can.
“It’s a blueprint for what they’re planning to do in the next couple of years,” says Scott Aaronson at the University of Texas at Austin, who has discussed the plans with the team.
So how will they do it? Quantum computers process data as quantum bits, or qubits. Unlike classical bits, these can store a mixture of both 0 and 1 at the same time, thanks to the principle of quantum superposition. It’s this potential that gives quantum computers the edge at certain problems, like factoring large numbers. But ordinary computers are also pretty good at such tasks. Showing quantum computers are better would require thousands of qubits, which is far beyond our current technical ability.
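As a toy illustration only (this is the textbook amplitude model, not anything from Google’s hardware), a qubit’s superposition can be pictured as a pair of amplitudes whose squared magnitudes give the measurement probabilities:

```python
import math
import random

# A single qubit as a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Equal amplitudes give a 50/50 superposition.
alpha = beta = 1 / math.sqrt(2)

def measure(alpha, beta):
    """Collapse the state: returns 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)   # roughly 5,000 of each outcome
```

The catch, and the reason simulating real quantum hardware is hard, is that n qubits need 2^n amplitudes: 50 qubits already means about 10^15 numbers, which is what puts Google’s target beyond classical machines.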
Instead, Google wants to claim the prize with just 50 qubits. That’s still an ambitious goal – publicly, they have only announced a 9-qubit computer – but one within reach. More here: Revealed: Google’s plan for quantum computer supremacy
The Châtelperronian period refers to one of five stone tool industries identified within the Upper Paleolithic period of Europe (ca 45,000-20,000 years ago). Once thought the earliest of the five industries, the Châtelperronian is today recognized as roughly coeval with or perhaps somewhat later than the Aurignacian period: both are associated with the Middle Paleolithic to Upper Paleolithic transition, ca. 45,000-33,000 years ago. During that transition, the last Neanderthals in Europe died out, the result of a not-necessarily-peaceful cultural transition of European ownership from the long-established Neanderthal residents to the new influx of early modern humans from Africa.
When first described and defined in the early twentieth century, the Châtelperronian was believed to be the work of early modern humans (then called Cro-Magnon), who, it was thought, had descended directly from Neanderthals. The split between Middle and Upper Paleolithic is a distinct one, with great advances in the range of stone tool types and also in raw materials–the Upper Paleolithic period has tools and objects made of bone, teeth, ivory and antler, none of which was seen in the Middle Paleolithic.
The change in technology is today associated with the entrance of early modern humans from Africa into Europe. However, the discovery of Neanderthals at Saint Cesaire (aka La Roche a Pierrot) and Grotte du Renne (aka Arcy-sur-Cure) in direct association with Châtelperronian artifacts led to the original debates: who made the Châtelperronian tools? More here: Chatelperronian Transition to Upper Paleolithic
If you bottle up a gas and try to image its atoms using today’s most powerful microscopes, you will see little more than a shadowy blur. Atoms zip around at lightning speeds and are difficult to pin down at ambient temperatures.
If, however, these atoms are plunged to ultracold temperatures, they slow to a crawl, and scientists can start to study how they can form exotic states of matter, such as superfluids, superconductors, and quantum magnets.
Physicists at MIT have now cooled a gas of potassium atoms to several nanokelvins—just a hair above absolute zero—and trapped the atoms within a two-dimensional sheet of an optical lattice created by crisscrossing lasers. Using a high-resolution microscope, the researchers took images of the cooled atoms residing in the lattice.
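To get a feel for why nanokelvin atoms “slow to a crawl”, the kinetic-theory relation v_rms = sqrt(3kT/m) is all you need. A quick back-of-the-envelope sketch (potassium-40 is assumed here for the mass, since the article doesn’t specify the isotope):

```python
import math

K_B = 1.380649e-23                    # Boltzmann constant, J/K
M_K40 = 39.964 * 1.66053906660e-27    # approx. mass of potassium-40, kg

def rms_speed(temperature_kelvin, mass_kg=M_K40):
    """Root-mean-square thermal speed, v = sqrt(3 k T / m)."""
    return math.sqrt(3 * K_B * temperature_kelvin / mass_kg)

print(f"{rms_speed(300):.0f} m/s")          # roughly 430 m/s at room temperature
print(f"{rms_speed(10e-9)*1000:.2f} mm/s")  # a few mm/s at 10 nK
```

Dropping the temperature by ten orders of magnitude takes the atoms from airliner-like speeds to a slow crawl, which is what lets the microscope pin them down in the lattice.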
By looking at correlations between the atoms’ positions in hundreds of such images, the team observed individual atoms interacting in some rather peculiar ways, based on their position in the lattice. Some atoms exhibited “antisocial” behavior and kept away from each other, while some bunched together with alternating magnetic orientations. Others appeared to piggyback on each other, creating pairs of atoms next to empty spaces, or holes.
The team believes that these spatial correlations may shed light on the origins of superconducting behavior. Superconductors are remarkable materials in which electrons pair up and travel without friction, meaning that no energy is lost in the journey. If superconductors can be designed to exist at room temperature, they could initiate an entirely new, incredibly efficient era for anything that relies on electrical power.
Read more at: individual-atoms-bunch-pairs
Individual cells within a tumor are not all the same. This may sound like a modern medical truism, but it wasn’t very long ago that oncologists assumed that taking a single biopsy from a patient’s tumor would be an accurate reflection of the physiological and genetic make-up of the entire mass.
Researchers have come to realize that cancer is a disease driven by the same “survival of the fitter” forces that Darwin proposed drove the evolution of life on Earth. In the case of tumors, however, individual cells are constantly evolving as a tumor’s stage advances. Mobile cancer cells causing metastasis are a deadly outcome of this process.
Tumors also differ among patients with the same type of cancer, so how is a physician able to prescribe a tailored regimen for the patient? To start to address this conundrum, an interdisciplinary team from the Perelman School of Medicine and the Wharton School at the University of Pennsylvania developed Canopy, an approach to infer the evolutionary track of tumor cells by surveying two types of mutations – somatic copy number alterations and single-nucleotide alterations – derived from multiple samples taken from a single patient. They demonstrated the approach on samples from leukemia and ovarian cancer, along with samples from a human breast cancer cell line. Overall, the evolution of a tumor involves the accumulation of mutations of all types collectively influencing the fitness of tumor cells.
The team, led by Yuchao Jiang, a doctoral student in the Genomics and Computational Biology program, with Yu Qiu, PhD, a postdoctoral researcher in the lab of coauthor Andy Minn, MD, PhD, an assistant professor of Radiation Oncology, and Nancy R Zhang, PhD, an associate professor of Statistics in the Wharton School, published their findings online in the early edition of the Proceedings of the National Academy of Sciences.
“The make-up of a tumor for a given patient is often a mixture of multiple distinct cell populations that differ in genetic make-up, gene expression, and physiology,” Jiang said. “This heterogeneity contributes to failures of targeted therapies and to drug resistance based on old thinking that tumors are homogenous masses.”
However, Canopy takes these differences into account because it uses data from multiple slices of the same tumor in space and time, as opposed to sampling one spot at one time point, the way it is currently done in most sequencing studies. Canopy is open-source software, so oncologists will be able to use it to identify potential biomarkers for different cancer cell populations within tumor specimens that are associated with drug resistance and invasive malignancy, among other characteristics. Source: Penn software helps to identify course of cancer metastasis, tumor ‘evolution’
U.S. Energy Department scientists say a new method of analyzing genetic mutations in proteins in human hair could lead to the first forensic technique other than DNA profiling that could reliably match biological evidence to a single person with scientific precision.
In results published Wednesday, researchers at Lawrence Livermore National Laboratory in California said their early study — using hairs recovered from 76 living people and six sets of skeletal remains from London dating to the 1750s — shows the promise of hair “proteomics,” or the study of proteins that genes produce.
“We are in a very similar place with protein-based identification to where DNA profiling was during the early days of its development,” said Brad Hart, director of the national laboratory’s Forensic Science Center and co-author of the study with lead researcher Glendon Parker. “This method will be a game-changer for forensics,” Hart said, while cautioning that many steps remain before it is validated.
If borne out, independent experts said, hair protein analysis could address concerns about the reliability of visual comparisons of hair strands, a technique whose subjectivity has opened it to criticism that experts’ claims were frequently being overstated. Source – Has DNA met its match as a forensic tool?
DNA testing has for the first time confirmed the identity of the bacteria behind London’s Great Plague.
The plague of 1665-1666 was the last major outbreak of bubonic plague in Britain, killing nearly a quarter of London’s population. It’s taken a year to confirm initial findings from a suspected Great Plague burial pit during excavation work on the Crossrail site at Liverpool Street. About 3,500 burials have been uncovered during excavation of the site.
Testing in Germany confirmed the presence of DNA from the Yersinia pestis bacterium – the agent that causes bubonic plague – rather than another pathogen.
Some authors have previously questioned the identity of pathogens behind historical outbreaks attributed to plague. Daniel Defoe’s 18th century account of the catastrophic event in A Journal of the Plague Year described the gruesome fate of Londoners.
“The plague, as I suppose all distempers do, operated in a different manner on differing constitutions; some were immediately overwhelmed with it, and it came to violent fevers, vomitings, insufferable headaches, pains in the back, and so up to ravings and ragings with those pains,” Defoe wrote. Source: BBC News
A team of researchers with the National University of Singapore has found a way to get around what they describe as ‘Rayleigh’s curse’—a phenomenon that happens when two light sources appear to coalesce as they grow closer together, limiting the ability to measure the distance between them. In their paper published in the journal Physical Review Letters, the team describes how they applied a quantum mechanics technique to solve the problem.
For many years, scientists in a variety of fields studying stars through a telescope or objects through a microscope have been limited by the same problem: when two light sources sit very close together, the wave-like nature of light causes spreading, and the photons from the two sources overlap on the surface meant to measure the difference between them. Back in the late 1800s, John William Strutt, Lord Rayleigh, laid down the criterion describing this limitation, and it now bears his name. In this new effort, the researchers report a technique they have developed that gets around the problem, allowing the distance between light sources to be measured no matter how close together they are.
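The classical Rayleigh criterion itself is a one-line formula. As a quick sketch of the limit the researchers are beating (the 550 nm wavelength and 2.4 m, Hubble-sized aperture are just illustrative numbers, not from the study):

```python
import math

def rayleigh_limit_rad(wavelength_m, aperture_m):
    """Smallest angular separation a circular aperture can resolve,
    theta ~= 1.22 * lambda / D (the classical Rayleigh criterion)."""
    return 1.22 * wavelength_m / aperture_m

# Green light (550 nm) through a 2.4 m mirror, roughly Hubble-sized:
theta = rayleigh_limit_rad(550e-9, 2.4)
arcsec = math.degrees(theta) * 3600
print(theta, arcsec)   # ~2.8e-7 rad, about 0.06 arcseconds
```

Two stars closer together than that angle blur into one spot on a conventional detector, which is exactly the “curse” the quantum-metrology technique sidesteps.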
To address the diffraction problem, the researchers applied quantum metrology and quantum optics techniques, using a hybrid of quantum mechanics and a type of statistical theory—it involves working out which measurements are likely to give the most information when measuring sources of light—even when they violate the Rayleigh criterion. The result is an estimation, but one that is believed to be extremely accurate. In so doing, they have shown that ‘Rayleigh’s curse’ is not an actual limit, but one that can be overcome. The work by the team follows an earlier effort using another technique and is different from other techniques that also overcome the Rayleigh limit that other teams have been reporting.
The researchers report that their technique is practical—devices using it will allow scientists to measure the distance between very close stars or very tiny objects that until now have not been discernable. One such area of research, they note, is fluorescence microscopy, which they believe should be a particularly good starting point. Source: Quantum mechanics technique allows for pushing past ‘Rayleigh’s curse’
Europe’s comet lander Philae has been found. The little robot is visible in new images downloaded from the Rosetta probe in orbit around the icy dirt-ball 67P/Churyumov-Gerasimenko. European Space Agency (Esa) officials say there is no doubt about the identification – “it’s as clear as day”, one told the BBC.
Philae was dropped on to the comet by Rosetta in 2014 but fell silent 60 hours later when its battery ran flat. Although it relayed pictures and data about 67P to Earth, its actual resting place was a mystery.
It was assumed the robot had bounced into a dark ditch on touchdown – an analysis now borne out by the latest pictures, which were acquired from a distance of 2.7km from the icy body.
The images from Rosetta’s high-resolution Osiris camera were downlinked to Earth late on Sunday night, and have only just been processed. The discovery comes just three weeks before controllers plan to crash-land Rosetta itself on to the comet to formally end its mission. Source: BBC News
In paleontology, biochronology is the correlation in time of biological events using fossils. In its strict sense, it refers to the use of assemblages of fossils that are not tied to stratigraphic sections (in contrast to biostratigraphy, where they are). Collections of land mammal ages have been defined for every continent except Antarctica, and most are correlated with each other indirectly through known evolutionary lineages. A combination of argon–argon dating and magnetic stratigraphy allows a direct temporal comparison of terrestrial events with climate change and mass extinctions.
Comparison with biostratigraphy
In sedimentary rocks, fossils are the only widely applicable tool for time correlation. Evolution leaves a record of progressive change, sequential and nonrepeating. A rock unit has a characteristic assemblage of fossils, independent of its lithology. Thus, the fossils can be used to compare the ages of different rock units.
The basic unit of biochronology is the biostratigraphic zone, or biozone, a collection of fossils found together in a rock unit. This is used as the basis of a biochron, “a unit of time in which an association of taxa is interpreted to have lived.” However, a biozone may vary in age from one location to another. For example, a given taxon may migrate, so its first appearance varies from place to place. In particular, facies-controlled organisms (organisms that lived in a particular sedimentary environment) are not well suited for biochronology because they move with their environment and may change little over long periods of time. Thus, biostratigraphers search for species that are particularly widespread, abundant, and not tied to particular sedimentary environments. This is particularly true of free-swimming animals such as planktonic foraminifera, which readily spread throughout the world’s oceans.
Another challenge for stratigraphy is that there are often large gaps in the fossil record at a given location. To counter this, biostratigraphers search for a particularly well-preserved section that can be used as the type section for a particular biostratigraphic unit. As an example, the boundary between the Silurian and Devonian periods is marked by the first appearance of the graptolite Monograptus uniformus uniformus in a section in Klonk, Czech Republic.
In terrestrial deposits, fossils of land mammals and other vertebrates are used as stratigraphic tools, but they have some disadvantages relative to marine fossils. They are seldom evenly distributed through a section, and they tend to occur in isolated pockets with few overlaps between biozones. Thus, correlations between biozones are often indirect, inferred using knowledge of their sequence of evolution. This practice was first proposed by H. S. Williams in 1901.
In the United States, biochronology is widely used as a synonym for biostratigraphy, but in Canada and Europe the term is reserved for biochronology that is not tied to a particular stratigraphic section. This form of biochronology is not recognized by the International Stratigraphic Guide, but it is “really what a great many paleontologists and stratigraphers are after … an optimum network of fossil correlations, thought to embody a reliable and high-resolution isochronous time (lines) framework.” Source Wikipedia
Natural History Museum scientists, working as part of the Gibraltar Caves Project, excavated and studied the remains of shellfish and other marine animals, such as dolphins, from two caves in Gibraltar where Neanderthals once lived. They discovered that Neanderthal diets were more like those of early modern humans than previously thought.
It is known that genes inherited from ancient retroviruses are essential to the placenta in mammals, a finding to which scientists in the Laboratoire Physiologie et Pathologie Moléculaires des Rétrovirus Endogènes et Infectieux (CNRS/Université Paris-Sud) contributed. Today, the same scientists reveal a new chapter in this story: these genes of viral origin may also be responsible for the more developed muscle mass seen in males. Their findings are published on 2 September 2016 in PLOS Genetics.
Retroviruses carry proteins on their surface that are able to mediate fusion of their envelope with the membrane of a target cell. Once released inside that cell, their genetic material becomes integrated in the host’s chromosomes. In the rare cases where the infected cell is involved in reproduction, the viral genes may be transmitted to progeny. Thus nearly 8% of the mammalian genome is made up of vestiges of retroviruses, or “endogenous” retroviruses. Most of them are inactive, but some remain capable of producing proteins: this is the case of syncytins, proteins that are present in all mammals and encoded by genes inherited from retroviruses “captured” by their ancestors. A little more than five years ago, and thanks to inactivation of these genes in mice, the team led by Thierry Heidmann demonstrated that syncytins contribute to formation of the placenta. Because of their ancestral ability to mediate cell-cell fusion they give rise to the syncytiotrophoblast, a tissue formed by the fusion of a large number of cells derived from the embryo, at the fetomaternal interface.
Using the same mice, the team has revealed a “collateral” and unexpected effect of these proteins: they endow males with more muscle mass than females. Like the syncytiotrophoblast, muscle fibers develop from fused stem cells. In the genetically modified male mice, these fibers were 20% smaller and displayed 20% fewer nuclei than in standard males; they were then similar to those seen in females, as was their total muscle mass. It therefore appears that the inactivation of syncytins leads to a fusion deficit during muscle growth, but only in males. The scientists observed the same phenomenon in the case of muscle regeneration following a lesion: the male mice incapable of producing syncytins experienced less effective regeneration than the other males, but it was comparable to that seen in females. Furthermore, the regenerating muscle fibers produced syncytin – once again, only in males. Source: Placenta in females, muscle mass in males: The dual heritage of a virus
I have modified a few old masters and put them into a gallery of pictures with a juxtaposition of the old masters and some modern technology. Just looking at Vermeer’s beautiful ‘Girl with a Pearl Earring’ wearing a Bluetooth earpiece excites our imagination. Then there is the anonymous couple in the late-night diner in Hopper’s famous Nighthawks, but what is the girl doing? Perhaps she is checking Facebook or reading a message from her husband asking ‘where r u’. Jim – Deskarati