Amazon has been issued a patent on security measures that prevent people from comparison shopping while in the store. It's not a particularly sophisticated patent -- it basically detects when you're using the in-store Wi-Fi to visit a competitor's site and then blocks access -- but it is an indication of how retail has changed in recent years.
What's interesting is that Amazon is on the other side of this arms race. As an online retailer, it wants people to walk into stores and then comparison shop on its site. Yes, I know it's buying Whole Foods, but it's still predominantly an online retailer. Maybe it patented this to prevent stores from implementing the technology.
It's probably not nearly that strategic. It's hard to build a business strategy around a security measure that can be defeated with cellular access.
I intend to start my video will this way. If I do, I will make it clear that my surroundings are a rented set, so not only will none of my beneficiaries get any of the items they see in the video, but procuring them for said video cost quite a bit of money, reducing how much there is left for anyone to inherit.
I know they won’t be happy with me, but at that point, what are they going to do?
Hey, by the way, my latest book, Run Program, is out now! It's a book about a rogue AI that has the intelligence of a child. You might think that would make the AI less dangerous, but you'd be wrong. Anyway, I'm quite proud of it. Please check it out, if you have a chance.
No interview in this week’s BSDNow, but there’s a good run through recent BSD news, including some talk about an “Aeronix” machine project, which led me to some other interesting links.
Hong Kong—a former British colony, now an autonomous territory within China—is a vibrant city of nearly 7.5 million residents, all packed into an area smaller than 425 square miles (1,100 sq km). About 40% of the land in Hong Kong is set aside as country parks and nature reserves. As architects and developers continue to maximize the use of buildable land, apartment blocks and office towers reach for the sky, leaving Hong Kong with more skyscrapers than any other city in the world. Gathered here are recent images of the vertical cityscape, street scenes, monuments, people, and natural landscapes of Hong Kong.
Back in the 1970s, the British Interplanetary Society conceived the idea of designing a starship. The notion grew into Project Daedalus, often discussed in these pages, producing a final report that summed up what was then known about interstellar possibilities, from fusion propulsion to destination stars. Barnard’s Star, 6 light years out, became the target because at the time, it was the only star for which evidence of planets existed, though that evidence later turned out to be the result of an error in the instrument being used for the observations (more on this soon, in an essay I’ve written for the Red Dots campaign. I’ll link to it as soon as it runs).
The designing of Daedalus, much of it done in London pubs, was a highly significant event. What Alan Bond, Anthony Martin, Bob Parkinson and the rest were doing was not so much putting forth something that our civilization would build as sending us a clear message. Even at this stage of our development, humans could conceive of ways to reach the stars. There is nothing in the laws of physics that prevents it. What, then, is possible in another fifty years? In a hundred? Even now, the Daedalus concept is being reimagined by Project Icarus.
I think of successive iterations in starship concepts, periodically working through the puzzles in the light of improving technologies. This is why the TU Delft Starship Team (DSTART) has caught my eye. TU Delft is the Delft University of Technology (Technische Universiteit Delft), located in a town known for its canals just north of Rotterdam. Angelo Vermeulen, now a doctoral candidate in the systems engineering section there, is the founder of DSTART, a collection of students and researchers collaborating on an evolving starship concept.
But Vermeulen deserves further introduction. As well as being a space systems researcher, he is a biologist and artist, and serves as what TU Delft calls a ‘community architect’ in its Participatory Systems Initiative, a determinedly multidisciplinary effort. Back in 2009, Vermeulen created Space Ecologies Art and Design (SEAD), which is an international network developing projects interrelating ecology, technology and community. He has also worked with ESA and NASA, serving as crew commander of the latter’s HI-SEAS Mars simulation in Hawaii. He has held faculty positions ranging from the University of Applied Arts in Vienna to the Philippines Open University in Los Baños. He became a TED Senior Fellow in 2013.
The idea this polymathic researcher is promoting is not to develop actual hardware but to firm up a vision of deep space exploration, one which, says Vermeulen, “…unite[s] the biological, technological and social dimensions. And it is about starships that evolve during their journey.”
This is a multidisciplinary effort drawing on astrophysics, biotechnology, computer simulation, chemistry and art. It should bring to mind Rachel Armstrong’s work on worldships, analyzed in a series of essays in her book Star Ark: A Living, Self-Sustaining Spaceship (Springer, 2017). And in fact Vermeulen was one of the authors of an essay in that volume called “The World in One Small Habitat,” in a section entitled ‘Space Architectures.’
Just how a starship could evolve on journeys taking decades and perhaps longer is fodder for DSTART’s preliminary analysis, which includes near-term in-system technologies like asteroid mining and exploitation, 3D printing and studies in closed loop life support of the kind the European Space Agency is analyzing in a project called MELiSSA. The acronym here unpacks to Micro-Ecological Life Support System Alternative, defined by ESA this way:
… the recovering of food, water and oxygen from organic waste, carbon dioxide and minerals, using light as a source of energy to promote biological photosynthesis. It is an assembly of processes (mechanical grinding, bioreactors, filtration, wet oxidation, etc.) aiming at a total conversion of the organic wastes and CO2 to oxygen, water and food.
It is based on the principle of an “aquatic” lake ecosystem where waste products are processed using the metabolism of plants and algae which in return provide food, air revitalization and water purification.
“The World in One Small Habitat” looks at MELiSSA in the context of other projects that are exploring closed loop life support. Mastering the huge problems of sustainable ecologies is key to getting humans deep into the Solar System and beyond.
Image: A Bussard ramjet in flight, as imagined for ESA’s Innovative Technologies from Science Fiction project. Credit: ESA/Manchu.
Centauri Dreams has many readers in The Netherlands and nearby. If you’re anywhere in range on Monday June 26, DSTART will be holding a mini-symposium to share its recent work.
16:00: Evolving asteroid starships, Angelo Vermeulen
16:20: Why interstellar exploration, Jimmy Verkooijen
16:35: Modular starship architecture, Francisco Muñoz Correa
16:55: Modeling a regenerative ecosystem, Alvaro Papic
17:15: Modeling asteroid mining and architectural evolution, Andreas Theys
17:35: The use of VR for interactive starship design, Anton Dobrevski
17:50: Concluding remarks, Angelo Vermeulen
Monday June 26, 4-6pm
Room H (Wing A), Faculty TPM, TU Delft
Jaffalaan 5, 2628 BX Delft
Everyone is welcome. Feel free to bring guests.
Places are limited.
Efforts like these encourage me because they point to increasing interest in the great themes of interstellar exploration. And it’s always heartening to see active university groups tackling these matters — Drexel University’s energetic Icarus Interstellar chapter also comes to mind.
Interstellar crossings are well beyond our capabilities at present, at least at time frames we find acceptable. But we can still come together to analyze the issues and take early steps toward their solution, helping to define problems and clarify research needs. Ad astra incrementis means exactly this: No single breakthrough but steady and progressive steps. Not that we wouldn’t all welcome breakthroughs, but the point is that persistence and hard work are what it takes to move the ball forward even as we remain open to the unexpected.
Conventional wisdom says one reason so many hackers seem to hail from Russia and parts of the former Soviet Union is that these countries have traditionally placed a much greater emphasis on teaching information technology in middle and high schools than have educational institutions in the West, and yet they lack a Silicon Valley-like pipeline to help talented IT experts channel their skills into high-paying jobs. This post explores the first part of that assumption by examining a breadth of open-source data.
The supply side of that conventional wisdom seems to be supported by an analysis of educational data from both the U.S. and Russia, which indicates there are several stark and important differences between how American students are taught and tested on IT subjects versus their counterparts in Eastern Europe.
Compared to the United States, quite a few more high school students in Russia choose to specialize in information technology subjects. One way to measure this is to look at the number of high school students in the two countries who opt to take the advanced placement exam for computer science.
According to an analysis (PDF) by The College Board, in the ten years between 2005 and 2016, a total of 270,000 high school students in the United States opted to take the national exam in computer science (the “Computer Science Advanced Placement” exam).
Compare that to the numbers from Russia: A 2014 study (PDF) on computer science (called “Informatics” in Russia) by the Perm State National Research University found that roughly 60,000 Russian students register each year to take their nation’s equivalent to the AP exam — known as the “Unified National Examination.” Extrapolating that annual 60,000 number over ten years suggests that more than twice as many people in Russia — 600,000 — have taken the computer science exam at the high school level over the past decade.
In “A National Talent Strategy,” an in-depth analysis from Microsoft Corp. on the outlook for information technology careers, the authors warn that despite its critical and growing importance, computer science is taught in only a small minority of U.S. schools. The Microsoft study notes that although there currently are just over 42,000 high schools in the United States, only 2,100 of them were certified to teach the AP computer science course in 2011.
If more people in Russia than in America decide to take the computer science exam in secondary school, it may be because Russian students are required to study the subject beginning at a much younger age. Russia’s Federal Educational Standards (FES) mandate that informatics be compulsory in middle school, with any school free to choose to include it in their high school curriculum at a basic or advanced level.
“In elementary school, elements of Informatics are taught within the core subjects ‘Mathematics’ and ‘Technology,’” the Perm University research paper notes. “Furthermore, each elementary school has the right to make [the] subject ‘Informatics’ part of its curriculum.”
The core components of the FES informatics curriculum for Russian middle schools are the following:
1. Theoretical foundations
2. Principles of computer’s functioning
3. Information technologies
4. Network technologies
6. Languages and methods of programming
8. Informatics and Society
There also are stark differences in how computer science/informatics is taught in the two countries, as well as the level of mastery that exam-takers are expected to demonstrate in their respective exams.
Again, drawing from the Perm study on the objectives in Russia’s informatics exam, here’s a rundown of what that exam seeks to test:
Block 1: “Mathematical foundations of Informatics”,
Block 2: “Algorithmization and programming”, and
Block 3: “Information and computer technology.”
The testing materials consist of three parts.
Part 1 is a multiple-choice test with four given options, and it covers all the blocks. Relatively little time is set aside to complete this part.
Part 2 contains a set of tasks of basic, intermediate and advanced levels of complexity. These require brief answers such as a number or a sequence of characters.
Part 3 contains a set of tasks of an even higher level of complexity than advanced. These tasks usually involve writing a detailed answer in free form.
According to the Perm study, “in 2012, part 1 contained 13 tasks; Part 2, 15 tasks; and Part 3, 4 tasks. The examination covers the key topics from the Informatics school syllabus. The tasks with detailed answers are the most labor intensive. These include tasks on the analysis of algorithms, drawing up computer programs, among other types. The answers are checked by the experts of regional examination boards based on standard assessment criteria.”
In the U.S., the content of the AP computer science exam is spelled out in this College Board document (PDF).
US Test Content Areas:
Computational Thinking Practices (P)
P1: Connecting Computing
P2: Creating Computational Artifacts
P4: Analyzing Problems and Artifacts
The Concept Outline:
Big Idea 1: Creativity
Big Idea 2: Abstraction
Big Idea 3: Data and Information
Big Idea 4: Algorithms
Big Idea 5: Programming
Big Idea 6: The Internet
Big Idea 7: Global Impact
How do these two tests compare? Alan Paller, director of research for the SANS Institute — an information security education and training organization — says topics 2, 3, 4 and 6 in the Russian informatics curriculum above are the “basics” on which cybersecurity skills can be built, and they are present beginning in middle school for all Russian students.
“Very few middle schools teach this in the United States,” Paller said. “We don’t teach these topics in general and we definitely don’t test them. The Russians do and they’ve been doing this for the past 30 years. Which country will produce the most skilled cybersecurity people?”
Paller said the Russian curriculum virtually ensures kids have far more hands-on experience with computer programming and problem solving. For example, in the American AP test no programming language is specified and the learning objectives are:
“How are programs developed to help people and organizations?”
“How are programs used for creative expression?”
“How do computer programs implement algorithms?”
“How does abstraction make the development of computer programs possible?”
“How do people develop and test computer programs?”
“Which mathematical and logical concepts are fundamental to programming?”
“Notice there is almost no need to learn to program — I think they have to write one program (in collaboration with other students),” Paller wrote in an email to KrebsOnSecurity. “It’s like they’re teaching kids to admire it without learning to do it. The main reason that cyber education fails is that much of the time the students come out of school with almost no usable skills.”
On the bright side, there are signs that computer science is becoming a more popular focus for U.S. high school students. According to the latest AP Test report (PDF) from the College Board, almost 58,000 Americans took the AP exam in computer science last year — up from 49,000 in 2015.
However, computer science still is far less popular than most other AP test subjects in the United States. More than a half million students opted for the English AP exam in 2016; 405,000 took English literature; almost 283,000 took AP government, while some 159,000 students went for an AP test called “Human Geography.”
This is not particularly good news given the dearth of qualified cybersecurity professionals available to employers. ISACA, a non-profit information security advocacy group, estimates there will be a global shortage of two million cybersecurity professionals by 2019. A report from Frost & Sullivan and (ISC)2 prognosticates there will be more than 1.5 million cybersecurity jobs unfilled by 2020.
The IT recruitment problem is especially acute for companies in the United States. Unable to find enough qualified cybersecurity professionals to hire here in the U.S., companies increasingly are counting on hiring foreigners who have the skills they’re seeking. However, the Trump administration in April ordered a full review of the country’s high-skilled immigration visa program, a step that many believe could produce new rules to clamp down on companies that hire foreigners instead of Americans.
Some of Silicon Valley’s biggest players are urging policymakers to adopt a more forward-looking strategy to solving the skills gap crisis domestically. In its National Talent Strategy report (PDF), Microsoft said it spends 83 percent of its worldwide R&D budget in the United States.
“But companies across our industry cannot continue to focus R&D jobs in this country if we cannot fill them here,” reads the Microsoft report. “Unless the situation changes, there is a growing probability that unfilled jobs will migrate over time to countries that graduate larger numbers of individuals with the STEM backgrounds that the global economy so clearly needs.”
Microsoft is urging U.S. policymakers to adopt a nationwide program to strengthen K-12 STEM education by recruiting and training more teachers to teach it. The software giant also says states should be given more funding to broaden access to computer science in high school, and that computer science learning needs to start much earlier for U.S. students.
“In the short-term this represents an unrealized opportunity for American job growth,” Microsoft warned. “In the longer term this may spur the development of economic competition in a field that the United States pioneered.”
On June 19, 2017, NASA held a press conference at Ames Research Center located in the heart of California’s Silicon Valley. At this event, Mario Perez (Astrophysics Division – NASA Science Mission Directorate) along with Susan Thompson (SETI Institute), Benjamin Fulton (University of Hawaii at Manoa/Caltech) and Courtney Dressing (NASA Sagan Fellow at Caltech) presented the results relating to the latest catalog of extrasolar planets found during the primary mission of NASA’s highly successful Kepler spacecraft. According to the latest tally, there are now 4,034 planet candidates identified, with 2,335 of them verified by various means including follow-up observations by ground-based telescopes. Of greatest importance to those interested in finding out how common life outside of our solar system might be, roughly 50 near-Earth size habitable zone candidates have been detected by Kepler, with more than 30 of them currently verified (see “Habitable Zone Exoplanets from NASA’s Kepler Mission” for a summary of these earlier finds).
The primary objective of NASA’s Kepler mission is to determine how common rocky planets are in the habitable zone (HZ) with the ability to detect Earth-size planets in Earth-like orbits around Sun-like stars (i.e. true “Earth twins”) being the primary driver for the design of the spacecraft and its observation strategy. Assuming that our Earth is typical of life-bearing planets in the universe, this is the best place to start looking for habitable worlds outside of our solar system. Kepler looked for tiny dips in a star’s brightness from an orbiting planet transiting from our point of view. Since such transits only occur if the planet’s orbit is by chance aligned nearly edge-on to our line of sight, they are relatively rare. For example, a planet in an Earth-like orbit around a Sun-like star has only about a 0.5% chance of producing an observable transit. Planets in smaller orbits have proportionally higher chances of producing transits.
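To see where that 0.5% figure comes from: for a circular orbit viewed from a random direction, the probability of a transit is roughly the ratio of the star’s radius to the orbital radius. Here is a quick sketch of the arithmetic (my illustration of the geometry, not anything from the Kepler team):

```python
# A sketch of the transit geometry described above. For a randomly
# oriented circular orbit, the chance of a transit is ~ R_star / a.

R_SUN_AU = 0.00465  # the Sun's radius expressed in astronomical units

def transit_probability(r_star_suns: float, a_au: float) -> float:
    """Approximate transit probability for a circular orbit of radius
    a_au (in AU) around a star of r_star_suns solar radii."""
    return (r_star_suns * R_SUN_AU) / a_au

print(f"{transit_probability(1.0, 1.0):.2%}")  # Earth analog: ~0.47%
print(f"{transit_probability(1.0, 0.1):.2%}")  # 0.1 AU orbit: ~4.7%
```

Halving the orbital radius doubles the odds, which is one reason transit surveys turn up short-period planets so readily.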
In order to detect a significant number of these relatively uncommon planetary transits, Kepler observed the brightness of almost 200,000 stars in a single patch of sky straddling the border of the constellations Cygnus and Lyra covering 115 square degrees over a four-year primary mission which started on May 13, 2009. With the end of that primary mission coming when Kepler lost its ability to stay fixed on its target as the result of the loss of a second of its four reaction wheels, Kepler eventually moved on to an extended mission. Known as “K2”, this extended mission was designed to use the remaining pair of reaction wheels and the pressure from sunlight to maintain its attitude in order to observe a succession of star fields located along the ecliptic for three months at a time to expand the search for transiting exoplanets.
As the K2 mission continues, the Kepler science team has been using increasingly sophisticated software tools to search the database from the primary mission to detect the sometimes subtle signature of a transiting exoplanet. The purpose of the Kepler team’s press conference on June 19, 2017 was to present the results from the eighth catalog of Kepler finds as the project wraps up its work on the primary mission. For this latest release, members of the Kepler science team reprocessed the entire data set from Kepler’s primary mission using their latest tools. In addition, they performed simulations to determine how complete their survey results were – a necessary step to convert the raw statistics derived from the Kepler catalog into useful information on the occurrence rate of various sized exoplanets.
In the course of assembling this new catalog, the Kepler science team found another 239 planet candidates – objects of interest which require follow-up observations in order to verify their planetary nature. Of these, ten are roughly Earth-size objects orbiting inside the habitable zone of their host stars. Table 1 below summarizes the key properties of these ten recently announced Kepler planet candidates identified by their KOI designations (Kepler Object of Interest). These planet candidates have radii less than twice that of the Earth (or 2 RE) and orbit within an optimistic definition of the habitable zone (HZ). All of the data in this table are taken from the Data Release 25 (DR25) data set found in the NASA Exoplanet Archive. The amount of energy each candidate planet receives from its sun, or stellar flux, was calculated from those data.
Table 1: KOI Number | Period (days) | Orbit Radius (AU) | Planet Radius (Earth = 1) | Stellar Flux (Earth = 1)
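The stellar flux values are a straightforward application of the inverse-square law once the host star’s luminosity and the candidate’s orbital radius are known. A minimal sketch of the relation (an illustration only, not the archive’s actual pipeline; the example values are mine):

```python
# Effective stellar flux relative to Earth, via the inverse-square law.
# A minimal sketch of the relation, not the actual archive pipeline.

def stellar_flux(l_star_suns: float, a_au: float) -> float:
    """Flux a planet receives, in units of what Earth gets from the Sun.
    l_star_suns: host star luminosity in solar luminosities.
    a_au: orbital radius in AU."""
    return l_star_suns / a_au ** 2

# Illustrative: a star with 40% of the Sun's luminosity and a planet
# near 0.95 AU gives Seff ~ 0.44, comparable to KOI 7923.01 below.
print(stellar_flux(0.40, 0.95))
```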
While the Kepler science team has purposely used the most optimistic assessment of habitability to cast as wide a net as possible for targets for future study by scientists, realistically what are the chances that any of these planet candidates are actually habitable given what we know about them?
A thorough assessment of the habitability of any extrasolar planet would require a lot of detailed data on the properties of that planet, its atmosphere, its spin state, the evolution of its volatile content and so on. Unfortunately, at this very early stage, the only information typically available to scientists about extrasolar planets are basic orbit parameters, a rough measure of its size and some important properties of its sun. Combined with extrapolations of the factors that have kept the Earth habitable over billions of years (not to mention why our neighbors are not currently habitable), the best we can hope to do at this time is to compare the known properties of extrasolar planets to our current understanding of planetary habitability to determine if an extrasolar planet is “potentially habitable”. And by “habitable”, I mean in an Earth-like sense where the surface conditions allow for the existence of liquid water on the planet’s surface – one of the presumed prerequisites for the development of life as we know it. While there may be other worlds that might possess environments that could support life, these would not be Earth-like habitable worlds of the sort being considered here.
The first step in assessing the potential habitability of any exoplanet is to determine what sort of world it is: is it a rocky planet like the Earth or is it a volatile-rich mini-Neptune? Although examples of this newly recognized class of planet are of interest to scientists (in part because they are absent in our solar system), mini-Neptunes have poor prospects of being habitable in an Earth-like sense. If we know the radius and mass of an exoplanet, its mean density can be readily calculated, which in turn can be used to constrain its bulk composition. While the radii of transiting exoplanets can be determined from an analysis of the transit light curves, unfortunately the mass cannot be directly determined. Other methods, such as the analysis of precision radial velocity measurements, are needed to do that – a process that will take time assuming that the host star is bright enough and the orbiting planet massive enough for current instruments to detect.
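To make the density constraint concrete, here is a small sketch of the calculation for a spherical planet (illustrative values; a real analysis would also propagate the sizable measurement uncertainties):

```python
# Bulk density from mass and radius, assuming a spherical planet.
# A sketch of the constraint described above, in Earth units.
import math

EARTH_MASS_KG = 5.972e24
EARTH_RADIUS_M = 6.371e6

def mean_density_g_cm3(mass_earths: float, radius_earths: float) -> float:
    """Mean density in g/cm^3 for a planet of the given mass and radius."""
    mass = mass_earths * EARTH_MASS_KG
    radius = radius_earths * EARTH_RADIUS_M
    volume = (4.0 / 3.0) * math.pi * radius ** 3
    return (mass / volume) / 1000.0  # kg/m^3 -> g/cm^3

print(mean_density_g_cm3(1.0, 1.0))  # ~5.5, rocky like Earth
print(mean_density_g_cm3(5.0, 2.0))  # ~3.4, hints at a volatile envelope
```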
Without any information on the masses of Kepler’s new batch of planet candidates currently available, we are forced to rely on statistical arguments based on the observed mass-radius relationship of other exoplanets whose radii and masses have been measured. A series of analyses of Kepler data and follow-up observations published over the last several years have shown that there are limits on how large a rocky planet can become before it starts to possess increasingly large amounts of water, hydrogen and helium as well as other volatiles, making the planet a Neptune-like world. Rogers has shown that the radius above which planets are more likely to be mini-Neptunes than rocky worlds is no greater than 1.6 RE, and probably closer to 1.5 RE (see “Habitable Planet Reality Check: Terrestrial Planet Size Limits”).
A more recent analysis of the mass-radius relationship with a much larger collection of exoplanetary data by Chen and Kipping suggests that the gradual transition from rocky to volatile-rich exoplanets starts at about 1.2 RE, again with the probability that a planet is rocky decreasing with increasing radius. Hints of this transition from primarily rocky to volatile-rich exoplanets are evident in the statistical analysis of Kepler finds, which shows comparatively few exoplanets in the 1.5 to 2.0 RE size range. This result suggests that once rocky planets reach a radius of 1.5 RE, they become massive enough to retain more volatiles including a puffy envelope of hydrogen which allows them to jump the size gap to radii of 2.0 RE and greater. Only the largest rocky planets and the smallest mini-Neptunes fill this gap.
The next criterion that can be used to determine if a rocky exoplanet is potentially habitable is the amount of energy it receives from its sun, known as the effective stellar flux or Seff. According to the work by Kopparapu et al. (2013, 2014) on the limits of the habitable zone (HZ) based on detailed climate and geophysical modeling, the inner limit of the HZ for an Earth-like rocky planet is conservatively defined by the runaway greenhouse limit, where a planet’s temperature would soar even with no CO2 present in its atmosphere, resulting in the loss of all of its water in a geologically brief time. For an Earth-size planet orbiting a Sun-like star, this happens at an Seff value of 1.11 times that of Earth. For stars with effective surface temperatures lower than the Sun’s 5780 K, this Seff value for the inner limit of the HZ becomes lower because more of the stars’ energy is radiated in the infrared, where there are numerous atmospheric absorption bands which help to heat the atmosphere. For planets more massive than the Earth, this Seff limit becomes somewhat higher because the atmospheric column is compressed by the planet’s higher gravity.
As a planet receives less energy from its sun, various processes such as the carbonate-silicate cycle allow more CO2 to build up in the atmosphere which helps to increase the greenhouse effect and maintain surface temperatures. The outer limit of the conservative HZ, as defined by Kopparapu et al. (2013, 2014), corresponds to the maximum greenhouse limit beyond which a CO2-dominated greenhouse is incapable of maintaining a planet’s surface temperature. Instead of helping to heat the atmosphere, the addition of more CO2 beyond this point makes the atmosphere more opaque causing the surface temperatures to drop instead of increase. The latest work suggests an Seff value of about 0.36 for the outer limit of the HZ of a Sun-like star with cooler stars having slightly lower values. There are some slightly more optimistic definitions of the outer edge of the HZ such as the early-Mars scenario or invoking some sort of super-greenhouse where gases other than just CO2 contribute to warming a planet. But these more optimistic definitions do not change the Seff for the outer limit of the HZ significantly.
With these two criteria based on readily observable properties of transiting planets available, it is possible to make an initial assessment of the potential habitability of the latest planet candidates found by the Kepler science team.
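The sketch below compresses this two-step screen into code. The hard cutoffs are simplifications of the probabilistic results of Rogers and of Chen and Kipping, and the default HZ limits are the Sun-like-star values quoted above, so treat the labels as rough guidance rather than verdicts:

```python
# A sketch of the two-step screen described above. The cutoffs compress
# probabilistic results (Rogers; Chen & Kipping; Kopparapu et al.) into
# simple thresholds, so the labels are rough guidance only.

def size_class(radius_earths: float) -> str:
    """Rough rocky vs. mini-Neptune call from radius alone."""
    if radius_earths < 1.2:
        return "likely rocky"
    if radius_earths <= 1.6:
        return "transition region: rocky or mini-Neptune"
    return "likely mini-Neptune"

def hz_position(seff: float, inner: float = 1.11, outer: float = 0.36) -> str:
    """Conservative HZ check; the defaults are the Sun-like-star limits
    quoted above, and both limits shift for cooler stars."""
    if seff > inner:
        return "inside the runaway greenhouse (inner) limit"
    if seff < outer:
        return "beyond the maximum greenhouse (outer) limit"
    return "within the conservative HZ"

# KOI 7711.01, roughly: radius 1.31 RE, Seff 0.87, inner limit 1.10
print(size_class(1.31), "|", hz_position(0.87, inner=1.10))
```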
This exoplanet candidate is part of the Kepler 123 system which was discovered in 2014 to contain two confirmed super-Earth size exoplanets in short-period orbits of 17.23 and 26.70 days. Since the host star has a luminosity of about 1.82 times that of the Sun, Kepler 123b and c would have Seff values many tens of times greater than that of the Earth and could not be habitable. While the new planet candidate in this system, designated KOI 238.03, orbits much farther from Kepler 123 than its two confirmed neighbors, unfortunately its prospects for being habitable are still not very promising.
With an effective temperature of 6086±133 K, the inner edge of the HZ around Kepler 123 has an Seff of 1.23 for a 5 ME planet, according to Kopparapu et al. (2013, 2014). With a Venus-like Seff of 1.81, KOI 238.03 is far outside of the conservatively defined HZ. In addition, the 1.96 +0.33/-0.29 RE radius of this planet candidate makes it highly unlikely that it is a rocky planet like the Earth; it is much more likely a volatile-rich mini-Neptune. Taken together, it seems that KOI 238.03 has very poor prospects for being a habitable exoplanet given our current knowledge of its properties.
With a radius of 1.19 +0.08/-0.16 RE, KOI 7706.01 is right on the 1.2 RE border found by Chen and Kipping where the population of exoplanets begins to transition from being rocky to volatile-rich. At the same time it is still well below the radius found by Rogers where the majority of exoplanets would be expected to be volatile-rich. While there is some possibility that KOI 7706.01 is a mini-Neptune, the odds seem to heavily favor it being a rocky planet like the Earth.
Issues seem to arise when considering the Seff of this planet candidate. For an Earth-mass planet orbiting KOI 7706 with an effective temperature of 4281 +115/-140 K, the conservatively defined inner edge of the HZ is found at an Seff value of 0.96, based on Kopparapu et al. (2013, 2014). The calculated Seff for KOI 7706.01 of 2.00 is over twice that value. However, given the tight orbit of this planet candidate around its comparatively dim host star, it seems likely that KOI 7706.01 will be a synchronous rotator with the same side always facing its sun. Increasingly detailed climate models over the last two decades have shown that not only is synchronous rotation not an impediment to habitability but that the inner edge of the HZ can be much closer to the sun than it is for fast rotators like the Earth. This is because feedback mechanisms promote the formation of clouds on the perpetual daylit side of the planet, which reflect away energy and help moderate the surface temperature.
A recent model by Yang et al. suggests that the inner edge of the HZ of a slow or synchronous rotator has an Seff of 1.83. More recent work by Kopparapu et al. (2016) which makes more realistic assumptions about the rotation rate and its effects on global circulation suggests a similar value. While the Seff of 2.00 for KOI 7706.01 is still about 9% greater than this more optimistic definition of the inner edge of the HZ, the uncertainty in the parameters leading to this Seff value is large enough to suggest that this candidate may straddle the inner edge of the HZ. Until the properties of KOI 7706.01 can be more precisely determined, it can be considered a fair candidate for being a potentially habitable planet. No matter what kind of world this turns out to be, a more detailed characterization of this exoplanet’s properties would allow scientists to probe the limits of planetary habitability.
KOI 7711.01 is definitely one of the better candidates for being potentially habitable. The host star seems to be a slightly smaller version of the Sun with 62% of its luminosity. While the radius of 1.31 +0.34/-0.12 RE is well into the transition region between rocky and volatile-rich planets, the odds still seem to favor KOI 7711.01 being a rocky planet. The Seff of 0.87 is much lower than the 1.10 value for the conservatively defined inner edge of the HZ for an Earth-mass planet orbiting a star with an effective temperature of 5734±154 K, strongly suggesting that it orbits comfortably inside of the HZ. Among this recent batch of Kepler planet candidates, KOI 7711.01 comes the closest to being considered a true “Earth twin” – i.e. an Earth-size planet in an Earth-like orbit around a Sun-like star. It certainly deserves closer scrutiny once its planetary nature is confirmed.
Like KOI 7711.01, KOI 7882.01 has a radius of 1.31 RE suggesting that the odds favor it being a rocky planet. But with a smaller uncertainty in that measurement, +0.08/-0.12 RE compared to +0.34/-0.12, it would seem that it is slightly less likely that KOI 7882.01 is a mini-Neptune, thus improving its habitability prospects at least in this regard. For a star like KOI 7882 with an effective temperature of 4348±130 K, the inner edge of the conservatively defined HZ for an Earth-mass planet found by Kopparapu et al. (2013, 2014) would have a Seff of 0.96, which is much smaller than the current calculated value of 1.79. But like KOI 7706.01, this exoplanet is likely to be a synchronous rotator. Based on the model by Yang et al., the inner edge of the HZ for a slow or synchronous rotator would have an Seff of 1.85. This places KOI 7882.01 just inside of the HZ for this type of exoplanet. It would appear that this planet candidate has some good prospects for being potentially habitable and would be the type of target scientists could use to probe the true limits of the HZ.
With an Earth-like Seff of 0.97, KOI 7894.01 seems to orbit comfortably inside the HZ whose inner edge would correspond to an Seff of 1.14 for a host star with an effective temperature of 5995 +163/-181 K. Unfortunately, with a radius currently estimated to be 1.62 +0.49/-0.15 RE, it seems that this Kepler planet candidate is more likely to be a mini-Neptune with poor prospects of being habitable like the Earth. While it is certainly worth watching, it would seem that KOI 7894.01 is not likely to be potentially habitable.
Of the latest batch of Kepler planet candidates, KOI 7923.01 would seem to have the best prospects of being potentially habitable. Its radius of 0.97 +0.12/-0.10 RE is essentially identical to that of the Earth (to within current measurement uncertainties) and is therefore likely to be rocky like the Earth. For a host star like KOI 7923 with about 40% of the Sun’s luminosity and an effective temperature of 5060 +192/-174 K, the conservatively defined HZ for an Earth-mass planet would span Seff values from 1.02 out to 0.31 at the outer edge using the model of Kopparapu et al. (2013, 2014). With a Seff calculated to be 0.44, KOI 7923.01 is comfortably inside the outer part of its sun’s HZ. Although the Seff is lower than that of the Earth, it would seem that KOI 7923.01 is the closest any current Kepler find, confirmed or otherwise, has come to being considered a true Earth twin. As such, this planet candidate deserves further attention.
With an Seff currently estimated to be 0.69, KOI 7954.01 seems to orbit comfortably inside of its sun’s conservative HZ which, with a Sun-like effective temperature of 5769 +155/-172 K, spans Seff values from 1.11 to 0.36 for an Earth-mass planet. Unfortunately, this Kepler planet candidate with a radius of 1.74 +0.46/-0.14 RE is most likely an uninhabitable mini-Neptune. While certainly worthy of study for what it can tell us about this newly recognized class of exoplanet, KOI 7954.01 seems to have poor prospects for being habitable in an Earth-like sense.
The prospects for KOI 8000.01 being habitable are even poorer than those of KOI 7954.01. With a radius of 1.70 +0.43/-0.14 RE, this planet candidate is most likely a mini-Neptune with little chance of being habitable. With a Seff of 1.20, KOI 8000.01 seems to orbit right at the edge of the HZ which, for its host star with an effective temperature of 5663 +169/-152 K, has an Seff of 1.17 for a 5 ME planet as calculated by Kopparapu et al. (2013, 2014). Again, it seems that KOI 8000.01 has poor prospects for being habitable although it is certainly worth additional study to characterize this new class of exoplanet.
With a Mars-like radius of 0.42 +0.17/-0.12 RE, KOI 8012.01 is the smallest planet candidate being considered here. This candidate also has a Mars-like Seff of 0.37 which places it just inside of its host star’s HZ which, with an effective temperature of 3374 +112/-82 K, has an Seff of 0.24 at its outer limit, according to the models of Kopparapu et al. (2013, 2014). Unlike far too many other exoplanets which some have claimed to be habitable but are too big, this Kepler planet candidate may be too small to be habitable, if Mars in our own solar system is to serve as any sort of a guide. While there are questionable prospects for KOI 8012.01 being potentially habitable due to its small size, it would be an ideal target for probing the lower size limits of planetary habitability.
Based on an initial assessment, it would seem that KOI 8174.01 has better prospects for being habitable than the comparatively diminutive KOI 8012.01. Its measured radius of 0.64 ±0.07 RE places this candidate between being Earth-size and Mars-size, possibly giving this world an edge in staving off the worst effects of the atmosphere loss the Red Planet has experienced. The Seff of 0.70 also places this candidate comfortably in the middle of the host star’s HZ which, with an effective temperature of 5332 +160/-144 K, spans Seff values from 1.05 out to 0.33 for an Earth-mass planet. Given what little we know about sub-Earth size planets, this will be another ideal target for further examination and, at first blush, seems to have fairly good prospects for being potentially habitable.
When considering this list of ten planet candidates, it needs to be remembered that the Kepler science team purposely used an optimistic assessment of what constitutes a habitable planet. This was done, in part, in response to the uncertainties in the properties of these exoplanets but also because of the limitations of the current models of planetary habitability. The team’s goal was to include all planet candidates that had any chance of being habitable to serve as an input for a “short list” of targets for future investigators.
In this review, I have taken a different approach of trying to focus instead on those worlds which have reasonably realistic chances of being potentially habitable given what we know about them and using more conservative extrapolations of what it takes for a planet to maintain Earth-like habitable conditions. Based on this more conservative approach, it appears that planet candidates KOI 238.03, KOI 7894.01, KOI 7954.01 and KOI 8000.01 are probably volatile-rich mini-Neptunes instead of being rocky like the Earth. While this recently identified class of exoplanet is certainly worth detailed study, they have very poor prospects for being habitable in the Earth-like sense.
At the other end of the size spectrum, the planet candidate KOI 8012.01 seems more likely to be a slightly smaller version of Mars potentially sharing our neighbor’s habitability issues. The larger KOI 8174.01 falls neatly between Earth and Mars in size and, like KOI 8012.01, orbits comfortably inside the habitable zone (HZ) of its sun. More detailed study of these candidates promises to shed light on the lower mass limit of habitable planets.
The planet candidates KOI 7706.01 and KOI 7882.01 are likely to be synchronous rotators which orbit right at the inner edge of the HZ for slowly rotating planets. Studies of these worlds will also help scientists probe the limits of habitability in this part of parameter space. KOI 7711.01 and especially KOI 7923.01 come the closest to being Earth twins on this list of new Kepler planet candidates. These Earth-size candidates orbit comfortably inside of the conservatively defined HZ of their Sun-like host stars – exactly the kind of world that NASA’s Kepler mission was designed to detect.
But before we invest too much into any assessment about the potential habitability of these KOIs, it must be remembered that they are currently only planet candidates whose planetary status must be confirmed by time-consuming follow-up observations. While the current tools for processing Kepler data are constantly improving, there is the possibility that some of these planet candidates are false positives, with some other natural phenomenon or subtle instrumental artifact mimicking the signature of a transiting planet. This is of increasing concern as the limits of the hardware, data and software are pushed to extract ever smaller and more difficult to detect transiting planets.
It is also possible that some of these planet candidates may remain unconfirmed for years to come. Potential Earth-twin planet candidates KOI 2194.03 and KOI 5737.01 were first discussed publicly during an astronomical conference in January 2015 but remain unconfirmed 2½ years later (see “Earth Twins on the Horizon?”). It is also quite possible that some of these planet candidates will be confirmed but, as a result of the more detailed assessments afforded by follow-up observations, the properties of the host star will be updated. Any such changes would inevitably trickle down to alter the derived properties of the now confirmed exoplanet so that it may no longer be considered potentially habitable. Although the Kepler science team is formally winding up its work on the primary mission’s data set, there will be years if not decades of additional work required to characterize Kepler’s discoveries, potentially habitable or otherwise.
Here is the video from NASA’s June 19, 2017 press conference presenting the results from the release of the eighth Kepler catalog.
In addition to the articles cited above, there is an ever-growing list of articles on Drew Ex Machina related to the results from NASA’s Kepler mission. A complete list of these articles can be found on this web site’s Kepler mission page.
Jingjing Chen and David Kipping, “Probabilistic Forecasting of the Masses and Radii of Other Worlds”, The Astrophysical Journal, Vol. 834, No. 1, Article id. 17, January 2017
R. K. Kopparapu et al., “Habitable zones around main-sequence stars: new estimates”, The Astrophysical Journal, Vol. 765, No. 2, Article ID. 131, March 10, 2013
Ravi Kumar Kopparapu et al., “Habitable zones around main-sequence stars: dependence on planetary mass”, The Astrophysical Journal Letters, Vol. 787, No. 2, Article ID. L29, June 1, 2014
Ravi Kumar Kopparapu et al., “The Inner Edge of the Habitable Zone for Synchronously Rotating Planets around Low-mass Stars Using General Circulation Models”, The Astrophysical Journal, Vol. 819, No. 1, Article ID. 84, March 2016
Leslie A. Rogers, “Most 1.6 Earth-Radius Planets are not Rocky”, The Astrophysical Journal, Vol. 801, No. 1, Article id. 41, March 2015
Jun Yang et al., “Strong Dependence of the Inner Edge of the Habitable Zone on Planetary Rotation Rate”, The Astrophysical Journal Letters, Vol. 787, No. 1, Article ID L2, May 2014
NASA Releases Kepler Survey Catalog with Hundreds of New Planet Candidates, NASA Press Release 17-056, June 19, 2017 [Press Release]
The N.S.A. failed to consistently lock racks of servers storing highly classified data and to secure data center machine rooms, according to the report, an investigation by the Defense Department's inspector general completed in 2016.
The agency also failed to meaningfully reduce the number of officials and contractors who were empowered to download and transfer data classified as top secret, as well as the number of "privileged" users, who have greater power to access the N.S.A.'s most sensitive computer systems. And it did not fully implement software to monitor what those users were doing.
In all, the report concluded, while the post-Snowden initiative -- called "Secure the Net" by the N.S.A. -- had some successes, it "did not fully meet the intent of decreasing the risk of insider threats to N.S.A. operations and the ability of insiders to exfiltrate data."
Marcy Wheeler comments:
The IG report examined seven of the most important out of 40 "Secure the Net" initiatives rolled out since Snowden began leaking classified information. Two of the initiatives aspired to reduce the number of people who had the kind of access Snowden did: those who have privileged access to maintain, configure, and operate the NSA's computer systems (what the report calls PRIVACs), and those who are authorized to use removable media to transfer data to or from an NSA system (what the report calls DTAs).
But when DOD's inspectors went to assess whether NSA had succeeded in doing this, they found something disturbing. In both cases, the NSA did not have solid documentation about how many such users existed at the time of the Snowden leak. With respect to PRIVACs, in June 2013 (the start of the Snowden leak), "NSA officials stated that they used a manually kept spreadsheet, which they no longer had, to identify the initial number of privileged users." The report offered no explanation for how NSA came to no longer have that spreadsheet just as an investigation into the biggest breach thus far at NSA started. With respect to DTAs, "NSA did not know how many DTAs it had because the manually kept list was corrupted during the months leading up to the security breach."
There seem to be two possible explanations for the fact that the NSA couldn't track who had the same kind of access that Snowden exploited to steal so many documents. Either the dog ate their homework: Someone at NSA made the documents unavailable (or they never really existed). Or someone fed the dog their homework: Some adversary made these lists unusable. The former would suggest the NSA had something to hide as it prepared to explain why Snowden had been able to walk away with NSA's crown jewels. The latter would suggest that someone deliberately obscured who else in the building might walk away with the crown jewels. Obscuring that list would be of particular value if you were a foreign adversary planning on walking away with a bunch of files, such as the set of hacking tools the Shadow Brokers have since released, which are believed to have originated at NSA.
Read the whole thing. Securing against insiders, especially those with technical access, is difficult, but I had assumed the NSA did more post-Snowden.
I have no comment on the politics of this stabbing attack, and only note that the attacker used a ceramic knife -- that will go through metal detectors.
I have used a ceramic knife in the kitchen. It's sharp.
EDITED TO ADD (6/22): It looks like the knife had nothing to do with the attack discussed in the article.
What happens when an unstoppable shrimp meets an unmovable senator? A researcher goes to Washington to defend herself, her shrimp, and science itself.
(Image credit: Klaus Stiefel/Flickr)
Last week, Microsoft issued a security patch for Windows XP, a 16-year-old operating system that Microsoft officially no longer supports. Last month, Microsoft issued a Windows XP patch for the vulnerability used in WannaCry.
Is this a good idea? This 2014 essay argues that it's not:
The zero-day flaw and its exploitation is unfortunate, and Microsoft is likely smarting from government calls for people to stop using Internet Explorer. The company had three ways it could respond. It could have done nothing -- stuck to its guns, maintained that the end of support means the end of support, and encouraged people to move to a different platform. It could also have relented entirely, extended Windows XP's support life cycle for another few years and waited for attrition to shrink Windows XP's userbase to irrelevant levels. Or it could have claimed that this case is somehow "special," releasing a patch while still claiming that Windows XP isn't supported.
None of these options is perfect. A hard-line approach to the end-of-life means that there are people being exploited that Microsoft refuses to help. A complete about-turn means that Windows XP will take even longer to flush out of the market, making it a continued headache for developers and administrators alike.
But the option Microsoft took is the worst of all worlds. It undermines efforts by IT staff to ditch the ancient operating system and undermines Microsoft's assertion that Windows XP isn't supported, while doing nothing to meaningfully improve the security of Windows XP users. The upside? It buys those users at best a few extra days of improved security. It's hard to say how that was possibly worth it.
This is a hard trade-off, and it's going to get much worse with the Internet of Things. Here's me:
The security of our computers and phones also comes from the fact that we replace them regularly. We buy new laptops every few years. We get new phones even more frequently. This isn't true for all of the embedded IoT systems. They last for years, even decades. We might buy a new DVR every five or ten years. We replace our refrigerator every 25 years. We replace our thermostat approximately never. Already the banking industry is dealing with the security problems of Windows 95 embedded in ATMs. This same problem is going to occur all over the Internet of Things.
At least Microsoft has security engineers on staff that can write a patch for Windows XP. There will be no one able to write patches for your 16-year-old thermostat and refrigerator, even assuming those devices can accept security patches.
Happy first day of summer! Last night at 11:24 p.m. CDT, the sun reached the top of its roller coaster ride across the sky, marking the summer solstice. That makes today the first full day of the new season. Around here it coincides with the start of what will soon be a riot of daisies. A favorite flower, they seem to glow brighter than any other at dusk. Take a look next time you’re outside in the evening, cooling off after a hot day.
There’s an astronomical reason for the heat. It’s the sun’s altitude. Every year, sometime between June 20 and 22, the sun reaches its furthest point north in the sky. If you live in the northern hemisphere, that means the sun’s as high in the sky as it goes. Not only do longer days increase the heat, but the steep angle gives the sun’s rays greater intensity. Every celestial object that climbs high in the sky rises early, remains visible for many hours and sets late. Objects low in the sky like Saturn or the winter sun follow a short arc and are only up half as long.
With the sun high, today’s will be the longest day of the year and tonight the shortest night. True night, without a trace of dusk or dawn light, is a precious commodity the further north you live. Here in northern Minnesota twilight ends about 11:45 p.m. and resumes at 2:45 a.m. for a grand total of 3 hours of twilight-less night. Along the U.S.-Canadian border you’ll find a hint of twilight all night long low in the northern sky.
The sun’s arc never dips very far below the northern horizon in the summertime, so those living in the north can still sense its presence by longer and longer twilights. If you head up to the Arctic Circle (latitude 66.5° north), the sun won’t set at all. For a few nights, it scrapes along all 360° of the horizon for a midnight sun experience of a lifetime. The North Pole at 90° north latitude experienced its first midnight sun on the spring equinox. Now, on the first day of summer, the sun’s more than two fists high day and night, circling round and round. How I would miss the night!
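For anyone who wants to check the geometry, the sun’s noon and midnight altitudes follow from your latitude and the sun’s declination. A quick sketch with the standard formulas (refraction ignored; the example latitudes are mine):

```python
# Noon and midnight sun altitudes at the June solstice, using the
# standard spherical-geometry approximations (refraction ignored).

DECLINATION = 23.44  # the sun's declination at the June solstice, degrees

def noon_altitude(latitude_deg: float) -> float:
    """Sun's altitude above the horizon at local solar noon, in degrees."""
    return 90.0 - abs(latitude_deg - DECLINATION)

def midnight_altitude(latitude_deg: float) -> float:
    """Sun's altitude at local solar midnight; positive means midnight sun."""
    return latitude_deg + DECLINATION - 90.0

print(noon_altitude(46.8))       # ~66.6 deg over northern Minnesota
print(midnight_altitude(66.56))  # ~0 deg: the Arctic Circle threshold
print(midnight_altitude(90.0))   # ~23.4 deg at the North Pole, "two fists"
```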
We all know what happens once we reach our peak. Decline inevitably follows for most of us provided we live long enough (a small price to pay for a long life?). So too with the sun. It’s got nowhere else to go after today but the downslope of the roller coaster. Just like the coaster, it accelerates slowly at first before hitting its stride with the approach of fall.
All this up and down year after year is a reflection of the tilt of Earth’s axis. When angled toward the sun (see diagram above), the sun appears high in the sky. When angled away, it rides low. High, low, high, low year after year after year — it’s plain to see we’re never getting off this roller coaster. As they say, enjoy the ride!
The European Space Agency has just announced the official adoption of the PLATO mission. The untangled acronym — PLAnetary Transits and Oscillations of stars — tells us that, like Kepler and CoRoT, this is a planet hunting mission with asteroseismological implications. Photometric monitoring of nearby bright stars for planetary transits and determination of planetary radii should help build our target list for spectroscopic follow-up as we delve into planetary atmospheres looking for biosignatures. Launch is scheduled for 2026.
Asteroseismology studies how stars oscillate, giving us information about the internal structure of the star that would not be available through properties like brightness and surface temperature. PLATO will be carrying out high precision photometric monitoring at visible wavelengths, targeting bright stars (mV ≤ 11), though with capabilities for fainter stars down to magnitude 16. Several hundred thousand stars will ultimately be characterized in the search for planets around G-class stars like the Sun, subgiants and red dwarf stars.
Keep in mind one important difference between PLATO and Kepler. The latter worked with a starfield that was in most cases quite faint, down to magnitude 17. No one can argue with the success of Kepler and its recently released final catalog, but the faintness of many of its stars meant that many planetary candidates have proven difficult to follow up and confirm. PLATO’s wide field will allow monitoring of the brightest stars, which should allow smaller planets to be followed up by ground-based instruments and ultimately confirmed.
“Using observations of stellar vibrations, PLATO will for the first time fully characterize these stars and their planets with regard to mass, radius, and age”, says Prof. Dr. Laurent Gizon, director of the Max Planck Institute for Solar System Research and head of the PLATO Data Center. “This will revolutionize the study of the evolution of exoplanets and their host stars… With today’s adoption, the implementation – the actual building and construction of the spacecraft and its instruments – can begin. In parallel, the design of the software to analyse the observations will be developed at the PLATO Data Center.”
Image: Artist’s impression of one of the new worlds that PLATO will discover. Among those there will be Earth-like planets around Sun-like stars with the potential to host life. Credit & copyright: MPS/ Mark A. Garlick (markgarlick.com).
Bear in mind, too, that unlike the upcoming TESS (Transiting Exoplanet Survey Satellite) mission, PLATO will have the ability not only to do asteroseismology (quite useful in determining the size of exoplanets it discovers), but also to look for Earth-like planets around stars like our G-class Sun. TESS should give us plentiful information about planets around M-dwarfs, but PLATO backs us out to Earth-like worlds around stars like our own.
Using 34 separate small telescopes and cameras, PLATO’s current plan calls for a four-year observation period consisting of long-duration observations of two sky fields lasting two years each. But this could change: ESA also charts an alternative course including a single, three-year long-duration phase and a one-year phase with several different pointings. The final strategy is not to be decided until two years before launch, but depending on the choice, the mission will cover between 10 percent and 50 percent of the sky during the nominal mission. An extended mission of up to four years will be a built-in option.
Orbiting the L2 Lagrangian point, PLATO should be able to characterize numerous rocky, icy or giant planets at unprecedented levels of precision, with radius measurements accurate to 3 percent and mass determinations to better than 10 percent, assembling a catalog of confirmed and characterized planets with known mean densities, compositions and evolutionary stages, some of them, presumably, in the habitable zone.
Image: The PLATO-team at the Max Planck Institute for Solar System Research. Credit & copyright: MPS.
ESA’s point about planetary evolution is telling, especially in light of the work we looked at yesterday on mini-Neptunes and the ‘gap’ in the Kepler data between them and smaller, rocky worlds. The focus on highly accurate radius and mass measurements and better determination of stellar ages is designed to give us a better understanding of planetary changes over time.
Terrestrial planets can lose their primordial hydrogen atmospheres, later developing secondary atmospheres and perhaps life. A key goal for PLATO, then, is to study the physical and dynamical processes on Earth-like planets at different epochs in the life of a stellar system, with asteroseismology being a key determinant of host star ages. ESA believes asteroseismology can now determine a star’s age to a precision of 10 percent.
And, of course, we continue the quest for planets in the habitable zone of their star. PLATO’s long-duration observations of a star field should allow the mission to capture two transits of any Earth twin it finds around a G-class star. “With this concept and the high precision of the instrument we will find rocky planets orbiting sunlike stars and will be able to characterise them accurately,” says Heike Rauer (DLR-Berlin), principal investigator of the mission.
The US National Highway Traffic Safety Administration (NHTSA) is proposing a requirement that every car should broadcast a cleartext message specifying its exact position, speed, and heading ten times per second. In comments filed in April, during the 90-day comment period, we (specifically, Leo Reyzin, Anna Lysyanskaya, Vitaly Shmatikov, Adam Smith, together with the CDT via Joseph Lorenzo Hall and Joseph Jerome) argued that this requirement will result in a significant loss to privacy. Others have aptly argued that the proposed system also has serious security challenges and cannot prevent potentially deadly malicious broadcasts, and that it will be outdated before it is deployed. In this post I focus on privacy, though I think security problems and resulting safety risks are also important to consider.
The basic summary of the proposal, known as Dedicated Short Range Communication (DSRC), is as follows. From the moment a car turns on, and every tenth of a second until it shuts off, it will broadcast a so-called “basic safety message” (BSM) to a minimum distance of 300m. The message will include position (with accuracy of 1.5m), speed, heading, acceleration, yaw rate, path history for the past 300m, predicted path curvature, steering wheel angle, car length and width rounded to 20cm precision, and a few other indicators. Each message will also include a temporary vehicle id (randomly generated and changed every five minutes), to enable receivers to tell whether they are hearing from the same car or from different cars.
Under the proposal, each message will be digitally signed. Each car will be provisioned with 20 certificates (and corresponding secret keys) per week, and will cycle through these certificates during the week, using each one for five minutes at a time. (Note that a week contains more than 2,000 five-minute intervals, so each of the 20 certificates will recur roughly a hundred times over the week.) Certificates will be revocable; revocation is meant to guard against incorrect (malicious or erroneous) information in the broadcast messages, though there is no concrete proposal for how to detect such incorrect information.
It is not hard to see that if such a system were to be deployed, a powerful antenna could easily listen to messages from well beyond the 300m design radius (we’ve seen design ranges extended by two or three orders of magnitude through the use of good antennas with Bluetooth and Wi-Fi). Combining data from several antennas, one could easily link messages together, figuring out where each car was parked, what path it took, and where it ended up. This information will often enable one to link the car to an individual, for example by looking at the address where the car is parked at night.
The fundamental privacy problem with the proposal is that messages can be linked together even though they have no long-term ids. The linking is simplest, of course, when the temporary id does not change, which makes it easy to track a car for five minutes. When the temporary id changes, two consecutive messages can be easily linked using the high-precision position information they contain. One also doesn’t have to observe the exact moment that the temporary id changes: it is possible to link messages by a variety of so-called “quasi-identifiers,” such as car dimensions; position in relation to other cars; the relationship between acceleration, steering wheel angle, and yaw, which will differ for different models; variability in how different models calculate path history; repeated certificates; etc. You can read more about various linking methods in our comments and in comments by the EFF.
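To make the linking step concrete, here is a toy sketch (mine, written in Emacs Lisp to match the other code in this collection; the plist message format and the 3-meter slack are illustrative assumptions, not part of the proposal). Two messages a tenth of a second apart can be chained whenever the second position lies within the distance the first car could plausibly have covered:

;; Each message is a plist with :x and :y in meters and :speed in m/s.
;; At 0.1 s intervals even a car at highway speed moves only ~3 m, so
;; a small radius chains messages into a track despite changing ids.
(defun bsm-same-car-p (msg1 msg2)
  "Return non-nil if MSG2 plausibly continues MSG1's track."
  (let* ((dx (- (plist-get msg2 :x) (plist-get msg1 :x)))
         (dy (- (plist-get msg2 :y) (plist-get msg1 :y)))
         (dist (sqrt (+ (* dx dx) (* dy dy))))
         ;; expected travel in 0.1 s, plus twice the 1.5 m position error
         (bound (+ (* 0.1 (plist-get msg1 :speed)) 3.0)))
    (<= dist bound)))

(bsm-same-car-p '(:x 100.0 :y 50.0 :speed 20.0)
                '(:x 102.1 :y 50.2 :speed 20.0))   ; => t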
Thus, by using an antenna and a laptop, one could put a neighborhood under ubiquitous real-time surveillance — a boon to stalkers and burglars. Well-resourced companies, crime bosses, and government agencies could easily surveil the movements of a large population in real time for pennies per car per year.
To our surprise, the NHTSA proposal did not consider the cost of lost privacy in its cost-benefit analysis; instead, it considered only “perceived” privacy loss as a cost. The adjective “perceived” in this context is a convenient way to dismiss privacy concerns as figments of imagination, despite the fact that NHTSA-commissioned analysis found that BSM-based tracking would be quite easy.
What about the safety benefits of the proposed technology? Are they worth the privacy loss? As the EFF and Brad Templeton (among others) have argued, the proposed mandate will take away money from other safety technologies that are likely to have broader applications and raise fewer privacy concerns. The proposed technology is already becoming outdated, and will be even more so by the time it is deployed widely enough to make any difference.
But, you may object, isn’t vehicle privacy already dead? What about license plate scanners, cell-phone-based tracking, or aerial tracking from drones? Indeed, all of these technologies are a threat to vehicle privacy. None of them, however, permits tracking quite as cheaply, undetectably, and pervasively. For example, license-plate scanners require visual contact and are more conspicuous than a hidden radio antenna would be. A report commissioned by NHTSA concluded that other approaches did not seem practical for aggregate tracking.
Moreover, it is important to avoid the fallacy of relative privation: even if there are other ways of tracking cars today, we should not add one more, which will be mandated by the government for decades to come. To fix existing privacy problems, we can work on technical approaches for making cell phones harder to track or on regulatory restrictions on the use of license plate scanners. Instead of creating new privacy problems that will persist for decades, we should be working on reducing the ones that exist.
Last week, the Department of Justice released 18 new FISC opinions related to Section 702 as part of an EFF FOIA lawsuit. (Of course, they don't mention EFF or the lawsuit. They make it sound as if it was their idea.)
There's probably a lot in these opinions. In one Kafkaesque ruling, a defendant was denied access to the previous court rulings that were used by the court to decide against it:
...in 2014, the Foreign Intelligence Surveillance Court (FISC) rejected a service provider's request to obtain other FISC opinions that government attorneys had cited and relied on in court filings seeking to compel the provider's cooperation.
The provider's request came up amid legal briefing by both it and the DOJ concerning its challenge to a 702 order. After the DOJ cited two earlier FISC opinions that were not public at the time -- one from 2014 and another from 2008 -- the provider asked the court for access to those rulings.
The provider argued that without being able to review the previous FISC rulings, it could not fully understand the court's earlier decisions, much less effectively respond to DOJ's argument. The provider also argued that because attorneys with Top Secret security clearances represented it, they could review the rulings without posing a risk to national security.
The court disagreed in several respects. It found that the court’s rules and Section 702 prohibited the documents’ release. It also rejected the provider’s claim that the Constitution’s Due Process Clause entitled it to the documents.
This kind of government secrecy is toxic to democracy. National security is important, but we will not survive if we become a country of secret court orders based on secret interpretations of secret law.
I’ve tried several different types of razor, and found that they all worked equally well, if the goal is to lacerate my face-meat.
Two different razor manufacturers ended up sending me free samples of their products to review after this comic ran. I found that I was perfectly capable of cutting my cheeks and neck to ribbons regardless of what razor I used. I’m currently using a Panasonic electric razor, with which I’m very happy. I’ll admit that if you keep up with replacing the blades and screens it doesn’t work out to be all that much cheaper than disposables, but I lose a lot less blood, and I get to use a gadget!
With several new missions launched, it is time for an update to the Space Observatories page, adding the X-ray telescope NICER on the ISS and the Chinese HXMT (Hard X-ray Modulation Telescope), now known as Huìyǎn (慧眼, Insight), likewise an X-ray mission.
In other news, the source data are now available in my space exploration history GitHub repository, together with the included infographics.
For immediate release
Today, the government introduced Bill C-59 “An Act respecting national security matters” and the BCCLA welcomes some important and long-needed “fixes” but remains disappointed that so much of “Bill C-51” is still intact.
Micheal Vonn, Policy Director of the BCCLA: “We are very pleased that the government is introducing an all-agency review body for national security. This is long overdue and urgently needed. That said, this is a complex re-arrangement and we can’t say at this time whether all the pieces are in place. There are many questions that will need to be answered, particularly with regards to the powers and mandate of the Intelligence Commissioner. But in the main, this proposal is moving in the right direction.”
The BCCLA also welcomes the amendments to the terrorism speech offences.
The Association has less positive responses to other aspects of the bill, such as changes to the Security of Canada Information Sharing Act (SCISA), the “no-fly” regime, and warrants for CSIS “disruption powers”.
“The bill does several things to try to rein in the unprecedented surveillance powers created by SCISA, but as no credible justification for SCISA was ever made, it would have been much better to repeal it and introduce any clarifying amendments required in the federal Privacy Act. While the bill’s provisions on information sharing are an improvement, they speak to tinkering as opposed to reform,” said Vonn.
“Likewise, we see clear attempts to improve the dreadful lack of due process protections in the no-fly scheme, but the fundamental flaws of the scheme remain. There is still too little transparency in the decision-making process and too many ways to postpone providing a remedy to people unjustly affected. As for the warrant provisions for the use of CSIS “disruption powers”, which were cited by many legal experts as unconstitutional, the bill doubtless improves the terrible state of the law that currently exists, but legitimate concerns remain, including the threshold question of the discretion to apply for the warrants,” Vonn added.
Micheal Vonn, Policy Director, email@example.com
In recognition of the warmer weather in the northern hemisphere, this is an invitation to take some time, go outside, and set a kite aloft. Collected here, some delightful images of kites in flight around the world from the past century.
The other day, I wrote about Fuco1’s efforts to add some context awareness to Emacs font-locking. Now he’s back with a new font-locking problem. This time, he wants to highlight interpolated variables in quoted strings in shell code. Those of you familiar with the Unix way will recall that there are two situations: variables will be interpolated in double-quoted strings but not in single-quoted strings. Fuco1 wants to distinguish the two cases by highlighting the first case but not the second. Thus we want
Foo="bar"
String1="We want highlighting for $Foo in this string."
String2='But no highlighting for $Foo in this string.'
This is another case where the font-locking has to be context aware: we want it in a double-quoted string but not in a single-quoted string so the context of where the interpolated variable appears matters.
As Fuco1 said in his original post, you can substitute a function for the normal regular expression controlling font-locking as long as the function has the same interface and return value as
re-search-forward. Check out Fuco1’s post for how he solved the problem. If you, like Fuco1, have a refined sense of style in such matters, you can install his code and get his results yourself.
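For the curious, here is one way such a matcher can be structured (a minimal sketch of the general technique, not Fuco1’s actual code; the function name and face choice are mine). The matcher behaves like re-search-forward but consults syntax-ppss to reject matches whose enclosing string is single-quoted:

;; Match $VARs up to LIMIT, but only inside double-quoted strings.
;; Like any font-lock matcher, it moves point and sets the match
;; data, just as `re-search-forward' does.
(defun my/sh-match-interpolated-variable (limit)
  "Find the next interpolated $VAR before LIMIT, skipping single quotes."
  (let (found)
    (while (and (not found)
                (re-search-forward "\\$[A-Za-z_][A-Za-z0-9_]*" limit t))
      ;; (nth 3 (syntax-ppss)) is the delimiter of the enclosing
      ;; string: ?\" for double quotes, ?\' for single quotes, nil
      ;; outside any string.
      (when (eq (nth 3 (syntax-ppss)) ?\")
        (setq found t)))
    found))

(font-lock-add-keywords
 'sh-mode
 '((my/sh-match-interpolated-variable 0 'font-lock-variable-name-face prepend)))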
As announced yesterday at NASA Ames, the Kepler team has released the final Kepler catalog from the spacecraft’s first four years of data and its deep stare into Cygnus. The numbers still impress me despite our having watched them grow with each new report: We have 4034 planet candidates, of which 2335 have been verified as exoplanets. More than 30 of the approximately 50 near-Earth sized habitable zone candidates have been verified.
The new release brought us 219 new candidates, 10 of them habitable zone possibilities, giving us a final catalog that is our first take on the prevalence and characteristics of planets in the Milky Way, and paving the way for future space-based instruments as we look for targets for atmospheric characterization and direct imaging. By introducing simulated planet transit signals and adding known false signals, the researchers were able to tighten up the catalog, guarding against errors introduced by the team’s processing methods.
The Kepler data are also the subject of a new take on planetary demographics, as seen in the image below, which takes us into the realm of the kind of planets we do not see in our Solar System, so-called ‘mini-Neptunes’ ranging from 2 to 3.5 times the size of Earth.
Image: This diagram illustrates how planets are assembled and sorted into two distinct size classes. First, the rocky cores of planets are formed from smaller pieces. Then, the gravity of the planets attracts hydrogen and helium gas. Finally, the planets are “baked” by the starlight and lose some gas. At a certain mass threshold, planets retain the gas and become gaseous mini-Neptunes; below this threshold, the planets lose all their gas, becoming rocky super-Earths. Credit: NASA/Kepler/Caltech (R. Hurt).
A research group working under Andrew Howard (Caltech) has drilled deep into the dataset, measuring the size of 1300 stars in the Kepler field of view to determine the radii of 2025 Kepler planets with four times more precision than had previously been achieved. Out of this work on planetary sizes comes the finding that these planets can be classified into two distinct groups: Rocky Earth-like planets and mini-Neptunes. From the paper:
We find evidence for a bimodal distribution of small planet sizes. Sub-Neptunes and super-Earths appear to be two distinct planet classes. Planets tend to prefer radii of either ∼1.3 R⊕ or ∼2.4 R⊕, with relatively few planets having radii of 1.5–2.0 R⊕. Planets in the gap have the maximum size for a rocky core, as seen in previous studies of bulk planet density and of ultra-short period planets. We posit that the bimodal planet radius distribution stems from differences in the envelope masses of small planets. While our current dataset is insufficient to distinguish between theoretical models that produce the gap, it charts a path forward to unraveling further details of the properties of the galaxy’s most abundant planets.
Although most of the Kepler planets have proven to be between the size of the Earth and Neptune, the planets were previously thought to span the range between the two rather than to fall into distinct groupings. Erik Petigura (Caltech) is a co-author of the new study, which will appear in The Astronomical Journal, along with two other papers that make up an observational program known as the California-Kepler Survey (see citations below):
“In the solar system, there are no planets with sizes between Earth and Neptune. One of the great surprises from Kepler is that nearly every star has at least one planet larger than Earth but smaller than Neptune. We’d really like to know what these mysterious planets are like and why we don’t have them in our own solar system.”
Image: Researchers using data from the W. M. Keck Observatory and NASA’s Kepler mission have discovered a gap in the distribution of planet sizes, indicating that most planets discovered by Kepler so far fall into two distinct size classes: the rocky Earths and super-Earths (similar to Kepler-452b), and the mini-Neptunes (similar to Kepler-22b). This histogram shows the number of planets per 100 stars as a function of planet size relative to Earth. Credit: NASA/Ames/Caltech/University of Hawaii (B. J. Fulton).
The Keck work involved researchers at UC Berkeley, Harvard, the University of Hawaii, Princeton and the University of Montreal as well as Caltech, in a multi-year project analyzing the Kepler spectral data to obtain precise measurements of the host stars, which in turn allowed the scientists to refine the sizes of the planets orbiting them. Why the planets fall into two distinct groups — with a clear gap between rocky Earths and mini-Neptunes — remains an open question. Are Earth-sized planets the norm, with some of them simply obtaining enough of a gas envelope to become mini-Neptunes? Andrew Howard comments:
“A little bit of hydrogen and helium gas goes a very long way. So, if a planet acquires only 1 percent of hydrogen and helium in mass, that’s enough to jump the gap. These planets are like rocks with big balloons of gas around them. The hydrogen and helium that’s in the balloon doesn’t really contribute to the mass of the system as a whole, but it contributes to the volume in a tremendous way, making the planets a lot bigger in size.”
On the other hand, the researchers speculate, planets whose gas envelopes are too thin to lift them above the gap may lose that gas to radiation from the host star. In this scenario, planets that wind up in the gap aren’t likely to stay there for long; their thin atmospheres are quickly blown off. Either scenario produces the populations we see in the new data, but we’ll need to learn more about the composition of mini-Neptunes to understand why they seem to form so easily around other stars but not around our Sun.
The paper adds this:
…making a planet with a thin atmosphere requires a finely tuned amount of H/He. Second, photoevaporating a planet’s envelope significantly changes its size. Our observation of two peaks in the planet size distribution is consistent with super-Earths being rocky planets with atmospheres that contribute negligibly to their size, while sub-Neptunes are planets that retain envelopes with mass fractions of a few percent.
Three papers from the California-Kepler Survey are now available on the arXiv site in preprint form, all of them accepted for publication at The Astronomical Journal: Petigura et al., “The California-Kepler Survey. I. High Resolution Spectroscopy of 1305 Stars Hosting Kepler Transiting Planets” (preprint); Johnson et al., “The California-Kepler Survey. II. Precise Physical Properties of 2025 Kepler Planets and Their Host Stars” (preprint); and Fulton et al., “The California-Kepler Survey. III. A Gap in the Radius Distribution of Small Planets” (preprint).
Org-mode is a fantastic way to organise information in simple text files. I often use internal links to other sections in a document for navigation, but I’ve found that there is not a great mechanism to quickly insert an internal link. I wanted an interface that would provide me a list of headlines in the document that I could use to select a link target, so I put together the functions below.
These simple functions leverage the fantastic ivy and worf packages, so install them first, then put the code below into your Emacs config file. Then, invoking
M-x bjm/worf-insert-internal-link provides an
ivy completion interface for the list of headlines in the document. Once you select the headline you want as the link target, the link is inserted for you.
;; use ivy to insert a link to a heading in the current document
;; based on `worf-goto`
(defun bjm/worf-insert-internal-link ()
  "Use ivy to insert a link to a heading in the current `org-mode' document.
Code is based on `worf-goto'."
  (interactive)
  (let ((cands (worf--goto-candidates)))
    (ivy-read "Heading: " cands
              :action 'bjm/worf-insert-internal-link-action)))

(defun bjm/worf-insert-internal-link-action (x)
  "Insert link for `bjm/worf-insert-internal-link'."
  ;; go to the chosen heading and store a link to it
  (save-excursion
    (goto-char (cdr x))
    (call-interactively 'org-store-link))
  ;; return to the original point and insert the stored link
  (org-insert-last-stored-link 1)
  ;; org-insert-last-stored-link adds a newline, so delete it
  (delete-char -1))
It’s an all-female crew tomorrow at dawn. Look low in the northeastern sky starting about an hour and a half before sunrise and the first thing you’ll notice is the lunar crescent filled out with ample earthshine. To its left will shine radiant Venus, the brightest planet in the sky. Both Venus and the moon (Luna) have deep mythological roots. Venus was the Roman goddess of love and beauty, while Luna was the female complement of the sun; she rode a chariot and wore a crescent-shaped crown on her head.
The moon and Venus won’t be in conjunction but they’ll still be close enough together to get your attention, about 7.5° apart. Once you’ve got your bearings, look a fist and a half (17°) to the left of Venus and you might be able to catch sight of the returning Pleiades star cluster. The little bunch of stars shaped like a miniature Big Dipper is also called the Seven Sisters, after the seven daughters of Atlas and Pleione in Greek mythology.
With dawn swelling by the minute, you’ll need binoculars to show the Pleiades at their best. The sight is a special one. After gracing the winter and early spring evening sky, the cluster disappeared from view in May, lost in the glare of daylight. Now, in late June, it returns reborn, draped in the colors of dawn. Delicate and flickering, the stars seem to struggle to “stand up” to the light. Seeing the Sisters also reminds us that the winter stars are already on the move in the east — a crazy thought to have in your head on the eve of the summer solstice.
Over the weekend, wildfires in central Portugal killed at least 63 people and injured 135 others; many of the victims were trapped in their cars. More than 1,600 firefighters are still battling the fires, which are believed to have been triggered by lightning strikes during a recent heat wave. Portugal declared three days of mourning for the victims of one of the deadliest forest fires in its recent history.
Xah Lee has a new page out in his Emacs Lisp Tutorial that serves as a 10 minute introduction to text properties. Text properties are one of those things that you probably won’t need to fiddle with directly unless you are writing a (major or minor) mode, but you often see them mentioned in such commands as
buffer-substring-no-properties, which is a common function that anyone might use to write Elisp that manipulates text. So even if you don’t use them directly, it’s nice to know what they are and how they work.
The tutorial just gives you a flavor of what you can do with properties but it links to the full documentation if you really want all the details. It’s pretty easy to deal with the properties, as you’ll see when you read the tutorial.
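By way of a taste, here’s a toy example of my own (not one from the tutorial): properties live on individual characters, so the same string can carry them on one span and not another.

;; Attach properties to part of a string, then read one back.
(let ((s (concat "plain " (propertize "fancy" 'face 'bold 'my-tag 42))))
  (list (get-text-property 0 'my-tag s)     ; => nil, "plain " is bare
        (get-text-property 6 'my-tag s)))   ; => 42, index 6 is in "fancy"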
I haven’t written about Lee’s tutorial for some time but I thought this new page was interesting and that others might enjoy it too. Lee has done a lot of work on the tutorial since I last mentioned it and it looks pretty good and is easy to navigate. One thing I especially like is that hovering over a function will pop up the doc string for that function so it’s easy to follow the action if an example uses a function you’re not familiar with.
If you work off-line a lot or just want to help out, you can buy a copy of both the Emacs and Emacs Lisp tutorials for $25.
The 2016 election was one of the most eventful in U.S. history. We will be debating its consequences for a long time. For those of us who pay attention to the security and reliability of elections, the 2016 election teaches some important lessons. I’ll review some of them in this post.
First, though, let’s review what has not changed. The level of election security varies considerably from place to place in the United States, depending on management, procedures, and of course technology choices. Places that rely on paperless voting systems, such as touchscreen voting machines that record votes directly in computer memories (so-called DREs), are at higher risk, because of the malleability of computer memory and the lack of an auditable record of the vote that was seen directly by the voter. Much better are systems such as precinct-count optical scan, in which the voter marks a paper ballot and feeds the ballot through an electronic scanner, and the ballot is collected in a ballot box as a record of the vote. The advantage of such a system is that a post-election audit that compares a random sample of paper ballots to the corresponding electronic records can verify with high confidence that the election results are consistent with what voters saw. Of course, you have to make the audit a routine post-election procedure.
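To make the audit idea concrete, here is a toy sketch (mine, in Emacs Lisp like the other snippets in this collection; real audits choose the sample size statistically to reach a stated risk limit):

;; Compare a random sample of paper ballots to the electronic records.
;; PAPER and ELECTRONIC are vectors of recorded votes indexed by
;; ballot id; N is the sample size (drawn with replacement here).
(defun audit-sample-mismatches (paper electronic n)
  "Return how many of N randomly sampled ballots disagree."
  (let ((mismatches 0))
    (dotimes (_ n)
      (let ((i (random (length paper))))
        (unless (equal (aref paper i) (aref electronic i))
          (setq mismatches (1+ mismatches)))))
    mismatches))

Zero mismatches in a well-chosen sample supports, with quantifiable confidence, the claim that the electronic tally matches what voters saw on paper.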
Now, on to the lessons of 2016.
The first lesson is that nation-state adversaries may be more aggressive than we had thought. Russia took aggressive action in advance of the 2016 U.S. election, and showed signs of preparing for an attack that would disrupt or steal the election. Fortunately they did not carry out such an attack–although they did take other actions to influence the election. In the future, we will have to assume the presence of aggressive, highly capable nation-state adversaries, which we knew to be possible in principle before, but now seem more likely.
The second lesson is that we should be paying more attention to attacks that aim to undermine the legitimacy of an election rather than changing the election’s result. Election-stealing attacks have gotten most of the attention up to now–and we are still vulnerable to them in some places–but it appears that external threat actors may be more interested in attacking legitimacy.
Attacks on legitimacy could take several forms. An attacker could disrupt the operation of the election, for example, by corrupting voter registration databases so there is uncertainty about whether the correct people were allowed to vote. They could interfere with post-election tallying processes, so that incorrect results were reported–an attack that might have the intended effect even if the results were eventually corrected. Or the attacker might fabricate evidence of an attack, and release the false evidence after the election.
Legitimacy attacks could be easier to carry out than election-stealing attacks, as well. For one thing, a legitimacy attacker will typically want the attack to be discovered, although they might want to avoid having the culprit identified. By contrast, an election-stealing attack must avoid detection in order to succeed. (If detected, it might function as a legitimacy attack.)
The good news is that steps like adopting auditable paper ballots and conducting routine post-election audits are useful against both election-stealing and legitimacy attacks. If we have strong evidence of voter intent, this will make election-stealing harder, and it will make falsified evidence of election-stealing less plausible. But attacks that aim to disrupt the election process may require different types of defenses.
One thing is certain: election workers have a very difficult job, and they need all of the help they can get, from the best technology to the best procedures, if we are going to reach the level of security we need.
The Pale Red Dot campaign that discovered Proxima Centauri b produced one of the great results of exoplanet detection. For many of us, the idea that a world of roughly Earth mass might be orbiting in Proxima Centauri’s habitable zone — where liquid water can exist on the surface — was almost too good to be true, and it highlighted the real prospect that if we find such a planet around the closest star to our own, there must be many more around similar stars. Hence the importance of learning more about our closest neighbors.
Which is why it’s so heartening to see that Pale Red Dot is by no means done. This morning, the team led by Guillem Anglada-Escudé (Queen Mary University, London) announced plans to acquire data from the European Southern Observatory’s HARPS instrument (High Accuracy Radial velocity Planet Searcher) in a new campaign to study not just Proxima Centauri in search of further planets, but also the red dwarfs Barnard’s Star and Ross 154.
Also involved will be a network of small telescopes performing photometric monitoring, including the Las Cumbres Observatory Global Telescope network, SpaceObs ASH2 in Chile, the Observatorio de Sierra Nevada and the Observatorio Astronómico del Montsec, both in Spain. But the star of the show continues to be HARPS, a high-precision spectrograph attached to the ESO’s 3.6-meter telescope at La Silla. HARPS is capable of detecting radial velocity motions down to 3.5 kilometers per hour, the pace of a leisurely evening walk.
Image: Lead author Guillem Anglada-Escudé speaking at a press conference in Garching, Germany about the 2016 discovery of Proxima Centauri b. Credit: ESO/M. Zamani.
Are there other planets around Proxima Centauri? The findings around TRAPPIST-1, all seven of them, give reason to hope that we’ll make further discoveries. As to Barnard’s Star, we still have no information about planets there, although for a time in the mid-20th century it was thought, owing to instrument error, that there might be one or more gas giants orbiting the star. We now know that isn’t the case, but the possibility of terrestrial-class worlds remains.
I’ll have more to say about that situation later in the week, but I do want to note that the reason the Project Daedalus planners chose Barnard’s Star as their mission target was the supposition that those planets existed. Proxima Centauri would obviously have been a closer target.
Image: Barnard’s Star ca. 2006. Credit: Steve Quirk.
The M-dwarf Ross 154 is just under 10 light years from Earth, the nearest star in Sagittarius. That distance is closing at a good clip (in astronomical terms), so that the star will come to within about 6.4 light years in another 157,000 years. Like Proxima Centauri, Ross 154 is a UV Ceti-type flare star, producing major flares on the order of every two days. Given that flare activity in M-dwarfs is a major factor in the question of whether life can develop, finding planets close enough to be characterized by later space- and ground-based telescopes would be a significant development, and the more systems we have for comparative analysis, the better.
It will be fascinating to watch the new Pale Red Dot campaign develop, for these observations will be highly visible to the general public. While Pale Red Dot presented its results on Proxima b to the public only after extensive peer review, the observational data from the new campaign, beginning with Proxima Centauri, will be revealed and discussed in real time.
The scientists involved intend to maintain an active social media presence supported by various online tools. Keep an eye on the Red Dots Facebook page, the Red Dots Twitter account and the #reddots hashtag, as well as the main project page, where updates and featured contributions from the community will be posted on a regular basis.
I have finally put my interactive timeline of Solar System Exploration History on GitHub as an open repository, including all data and graphics. Don't look at the code too closely, though; it has grown over a very long time, and it looks like it: a shrub. Well, making it accessible is at least one incentive to improve it.
I'll also put up some documentation in the readme and several overview graphics, such as the ground systems associated with spacecraft operations above. All of it is accessible via the page on this blog and on my github.io site.
Access Now has documented the “Doubleswitch” attack being used against a Twitter user, but it also works against other social media accounts:
With the Doubleswitch attack, a hijacker takes control of a victim's account through one of several attack vectors. People who have not enabled an app-based form of multifactor authentication for their accounts are especially vulnerable. For instance, an attacker could trick you into revealing your password through phishing. If you don't have multifactor authentication, you lack a secondary line of defense. Once in control, the hijacker can then send messages and also subtly change your account information, including your username. The original username for your account is now available, allowing the hijacker to register for an account using that original username, while providing different login credentials.
Conventional wisdom would have us believe that stars form in extremely powerful and ordered magnetic fields. But “conventional,” our universe is not (as Yoda might say).
In a new and fascinating study published in Astrophysical Journal Letters, astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile have gauged the magnetic field of a star some 1,400 light-years away in the Serpens star-forming region.
The star, called Ser-emb 8, is embedded inside the magnetic field passing through the molecular cloud it was born in. As the surrounding dust aligns itself with the direction of these magnetic field lines, ALMA is able to make precise measurements of the polarization of the emissions produced by this dust. From these incredibly sensitive measurements, a map of the polarization of light could be created, providing a view of the magnetic nest the star was born in.
And this nest is an unexpected one; it’s a turbulent region lacking the strong and ordered magnetism that would normally be predicted to be in the immediate vicinity of Ser-emb 8. Previous studies have shown newborn stars to possess powerful magnetic fields that take on an “hourglass” shape, extending from the protostar and reaching light-years into space. Ser-emb 8, however, is different.
“Before now, we didn’t know if all stars formed in regions that were controlled by strong magnetic fields. Using ALMA, we found our answer,” said astronomer Charles L. H. “Chat” Hull, at the Harvard-Smithsonian Center for Astrophysics (CfA) in Cambridge, Mass. “We can now study magnetic fields in star-forming clouds from the broadest of scales all the way down to the forming star itself. This is exciting because it may mean stars can emerge from a wider range of conditions than we once thought.”
By comparing these observations with computer simulations, the researchers have assembled an insightful view of the earliest magnetic environment surrounding a young star.
“Our observations show that the importance of the magnetic field in star formation can vary widely from star to star,” added Hull in a statement. “This protostar seems to have formed in a weakly magnetized environment dominated by turbulence, while previous observations show sources that clearly formed in strongly magnetized environments. Future studies will reveal how common each scenario is.”
Hull and his team think that ALMA has witnessed a phase of star formation before the young star generates powerful magnetic fields of its own, fields that would wipe out any trace of the pristine magnetic environment threading the star-forming region.
What do amateur astronomers look at in their telescopes? Planets, double stars, asteroids, comets and deep sky objects. Deep sky is shorthand for galaxies, stellar birth clouds called nebulae and star clusters. You’re probably familiar with a few like the Orion Nebula, the Pleiades star cluster (a.k.a. the Seven Sisters) and Andromeda Galaxy.
Tonight, we’re going to look at a juicy one, an object known simply as M4, the fourth deep sky object in a catalog of 110 objects compiled by 18th century astronomer Charles Messier. Messier was obsessed with hunting for comets but kept running into other glowy stuff — galaxies, nebulae, etc. — that he mistook for his favorite fuzzies. To avoid confusion, he cataloged the comet imposters. Ironically, despite his passion for comet hunting and the discovery of 13 comets, his catalog has become the source of his fame today. All beginning and amateur astronomers cut their teeth hunting up Messier objects, many of which comprise the brightest and best the deep sky has to offer.
Summer brings tons of stuff to look at through the telescope because the brightest swath of the Milky Way rises in the east this time of year and remains well-placed for deep sky mining all season and into the next. One of the brightest and easiest Messier objects is the globular cluster M4 in the constellation Scorpius the Scorpion.
Globulars, pronounced GLOB-you-ler, are the stellar equivalent of a busy beehive. Containing anywhere from tens of thousands to 10 million stars, they’re densely packed into the shape of a sphere. Most globulars don’t rotate the way the Earth does; their stars move randomly, affected by the gravitational pull of their neighbors. But they all hang together as big popcorn balls in the outer halo of the Milky Way galaxy, taking eons to revolve around its core. We know of 152 of them in our galaxy and have spotted thousands more in others.
The closest, at 7,200 light years, and also one of the brightest is M4. It’s so easy to find that you simply have to identify the bright, red-orange star Antares in Scorpius. In mid-June, Antares is up 2 fists high in the south-southeastern sky around 10:30 p.m., three fists up if you live in the southern U.S. While the cluster is visible as a faint spot with the naked eye under the darkest skies, even a pair of 35mm binoculars will pop it into view just 1.3° west (right) of Antares.
Find Antares and then focus the star sharply in binoculars. Now look a short distance just to its right and a little below. Assuming you’re observing from reasonably dark skies (not downtown!) you should see a fuzzy patch of haze like a star that doesn’t come to focus. That’s it! Binoculars aren’t powerful enough to resolve the cluster into individual stars. Instead they blend together into a uniform glow.
But the reality is the cluster spans some 50 light years and contains more than 20,000 stars. We see it as it was over 7,000 years ago, about the time the wheel was invented and humans began experimenting with writing. Yes, it’s still there in 2017! Stars in globular clusters live for many billions of years, so a few thousand hardly makes a difference.
Close and bright, M4 was the first globular cluster in which individual stars were resolved. Even a 6-inch telescope will show them heaped like tiny sugar crystals in the bowl of a spoon on moonless nights. I encourage telescope owners to shoot over to M4 after looking at Saturn. It’s a must-see deep sky object. When you bring up the magnification to 150x or higher, a row of brighter stars runs north-south across the center of the cluster. It looks like a zipper to my eyes.
Giant telescopes uncover additional treasures within this starry disco ball, including 12-13 billion-year-old white dwarfs (burned-out stars), the oldest so far discovered in the galaxy, and a city-sized neutron star spinning at the rate of 300 times a second and beaming powerful pulses of radiation into space.
It’s all there for you to see and ponder on breezy summer nights.
Yet again, sadly, this comic is based on an actual conversation. A coworker of mine told our supervisor he was smarter than her, and was surprised when she didn’t take “the news” well.
This comic was made way back when one would have to “go get” a video camera. Now, many of us just happen to have HDTV cameras on us at all times, attached to the supercomputers we use as phones. You never know what’s going to date a comic.
Hey, by the way, my latest book, Run Program, comes out Tuesday the 20th! It's a book about a rogue AI that has the intelligence of a child. You might think that would make the AI less dangerous, but you'd be wrong. Anyway, I'm quite proud of it. Please check it out, if you have a chance.
“Help! Help! That stylish re-imagining of urban masculinity is taking my baby!“
The weather’s nice but the links aren’t stopping!
Your unrelated comics link of the week: Mister Hayden Comics.
I worry every spring about whether the fireflies will return. Habitat loss from urban sprawl and light pollution are thought to be behind the decline in their numbers in recent years. When you use light as the language of love, bright porch lighting, streetlights and headlights can make it difficult for a potential mate to sense your intentions.
But I’m happy to report that this year they merely showed up late. Two nights ago, the flying beetles punctuated the Summer Triangle and Milky Way with glowing commas, periods, tildes and dashes in languages unique to each species. A mix of fields and woods along with warm evening temperatures brings them out in great numbers. June and July are the best months.
There’s a certain irony in the green fire in fireflies’ bellies. Look around the sky and try to find a green star. You’ll search in vain. Lots of stars emit green light including the sun, but they also give off light of every other color, too.
Funny thing: if you measured how much light the sun emits in each color, you’d find infrared (heat), red, blue, purple and every color imaginable. But of all the colors, it radiates most strongly in yellow-green! So why doesn’t it look like a blindingly bright firefly? Because it’s also sending out substantial amounts of blue, yellow, orange and red light that, combined, appear ‘white’ to our eyes.
Ultimately, it comes down to how the cone cells in the retina perceive color. There are three different kinds: those sensitive to red, those to blue and those to green. An apple looks red because the red cones respond strongly to red light while the blue and green ones don’t. When the signal from the trio goes to our brain we see a ‘red’ apple.
If the green and red cones are active but blue isn’t, we see yellow. To perceive green, the object must be strongly emitting only green light. Since the sun and stars like it also emit red and blue, all three types of cones fire up and we perceive white. We’re grateful for the fireflies as substitutes for the green stars we can never see.
Fireflies combine oxygen from the air with other chemicals in their light-producing organs to create a “cold” light that’s much more efficient than a typical incandescent light bulb. The process is called bioluminescence. When the firefly wants to light up, it adds oxygen to the mix. In a poetic circle, that oxygen derives from the evolution of massive stars, which create more complex elements by combining simpler ones in the heated, high-pressure environment of their cores.
In a process that’s taken more than 10 billion years, oxygen has followed a tortuous path from the bellies of stars into the abdomens of fireflies. To look up on a summer night is to see the cosmic in every tiny, living flash.
If you use Org mode and Babel, you know that by calling
org-edit-special (bound to C-c ' by default) you’re put in a separate buffer that has the mode of the source block you were working in. That’s really convenient because you get syntax highlighting, proper indentation, and all the other benefits of being in a programming mode.
If you do a lot of coding in source blocks, you may find it inconvenient to always be switching into the
org-edit-special mode. John Kitchin does a great deal of coding this way and decided to make his life easier by adding a keymap to the source block itself so that he could get the advantages of the programming mode without having to switch to the special buffer.
I use Org Babel a lot but I’m happy switching to the other buffer. Perhaps if I were doing it as much as Kitchin, I’d feel differently. If you would like to avoid the
org-edit-special buffer, take a look at Kitchin’s post and the accompanying video.
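As a rough sketch of the general idea (assumptions mine; Kitchin’s actual code, which hooks into font-lock, is in his post), one could lay a keymap text property over every source block so a chosen key works without leaving the Org buffer:

;; Give every src block a keymap so selected keys work without
;; entering the edit buffer.  A text-property sketch: re-run the
;; command after editing, since new blocks won't have the property.
(defvar my/org-src-map
  (let ((map (make-sparse-keymap)))
    (define-key map (kbd "<f5>") #'org-babel-execute-src-block)
    map)
  "Keys active while point is inside an org src block.")

(defun my/org-src-apply-keymap ()
  "Apply `my/org-src-map' to all src blocks in the current buffer."
  (interactive)
  (save-excursion
    (goto-char (point-min))
    (while (re-search-forward "^[ \t]*#\\+begin_src" nil t)
      (let ((beg (match-beginning 0)))
        (when (re-search-forward "^[ \t]*#\\+end_src" nil t)
          (put-text-property beg (point) 'keymap my/org-src-map))))))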
It was at that moment Mimi realized her father had lost his mind and their nightly family bedtime story was nothing more than the owner’s manual for a Keurig Model B70 coffee maker.
-Greg Pembroke of @reasonsmysoniscrying
If you don’t read appliance manuals to your kids, who will?
Like many people in ops-adjacent parts of the internet, I had a lot of feels around this reddit post, where a new software developer trying to follow instructions to set up his development environment accidentally borked the production database and was summarily fired as a result. Dr. Richard Cook of SNAFUCatchers wrote a piece in response looking at different ways that organizations can respond to failure. The reddit story is a pretty clear example of what Dr. Cook calls “blame and train”, but he notes that this is not the optimal reaction to failure as it tends to lead to organizational brittleness. More positive reactions to failure can lead instead towards organizational resilience, and Dr. Cook encouraged me to share a story of what that could look like.
So, let me tell you all about the time I borked Etsy.
Once upon a time, back in 2016, I needed to provision some new servers for something I was doing. Due to some fun idiosyncrasies of our provisioning setup, I ran into the not-totally-uncommon case where our yum repo had synced a newer version of a package from upstream than the version that we had pinned in Chef. This means that boxes will get provisioned ok, but then Chef will fail on the first run due to refusing to downgrade to the older version that we have pinned. Normally what we do in these situations is test the new package to make sure it doesn’t break anything, and then pin that newer version in Chef.
In this case, the package that was causing this problem was apache. “Fine,” I said to myself, “it’s a point release newer, I’ve tested it, it should only affect new boxes anyways, I’ll just push out the new version in Chef and get on with my provisioning.” So I did. And I logged onto a random web host to watch the chef run and make sure it did the right thing (which was supposed to be nothing) and… it did not do the right thing.
First of all, it installed the new package, which it was for sure not supposed to do, because our Chef recipe explicitly told it not to. Then it restarted apache, because it had to, and apache didn’t restart cleanly. That’s when I knew that that afternoon was about to get a lot more interesting. The first thing I did was turn to my colleague sitting next to me, who happened to be the ops engineer on call at the time. “Heyyyyyy friend,” I said, “I’m pretty sure you’re about to get paged a bunch because I just set apache on fire.” Then I hopped into our #sysops channel on Slack (which at the time was The Cool Place To Be for all your “what’s up with the site” questions) and let people there know what was going on.
The immediate response from everyone around was to ask, “What help do you need?” We brainstormed some ideas. The first thought, since apache hadn’t come back up cleanly after the upgrade, was to downgrade again. But remember how that particular version was breaking provisioning because it wasn’t in our repos anymore? Yup, meaning we couldn’t roll back to the old version because it was gone. Someone hopped on another random web host and ran Chef again, and discovered that apache started up fine after a second Chef run. While a couple people started digging in to figure out why, some more of us coordinated using dsh to force a Chef run everywhere. Others were keeping an eye on graphs and logs – and we realized that while the site did get very slow for the ~10 minutes this was actually going on, it didn’t actually go down, because a few servers hadn’t run Chef for whatever reason and so hadn’t gotten the upgraded version in the first place.
It was a beautiful scene of empathetic and coordinated chaos. People were using Slack to coordinate, to figure out what the status was and what still needed to be done, and to just help do it. People who arrived a few minutes late to the scene didn’t jump into asking whose “fault” it was, because that’s not how we roll; they just wanted to know what was going on and how they could help.
We got the site back to where it needed to be pretty quickly. Once that was done, some people went back to what they were doing before while others jumped right into figuring out things like: why did Chef upgrade the package on existing servers, which had never happened before and was decidedly weird (fun with apache modules and yum)? And why did the second Chef run fix things? (Our Chef recipe deletes a default config file that ships with apache, which broke with this particular version.) The only person who had anything negative to say to me about the whole incident was me.
At some places, I would have been blamed, called out, even fired. At Etsy, I got a sweater.
We of course held a post-mortem to talk through what happened, what we learned from the event, and how we could make our systems more resilient to this sort of thing in the future. The SNAFUCatchers talked about it at their Brooklyn rendezvous in March.
To me, this kind of response to an incident feels not only productive but like a hallmark of organizational maturity. It can be a challenge to move from a “blame and train” or “blame and shame” culture to a “blameless” or “blame-aware” one. But a focus on the desired outcome, on how people can work together to resolve a situation in the moment and to make the systems involved better equipped to handle future situations, can do wonders for system and organizational resilience.