What Is Web Hosting, Exactly? Everything You Wanted To Know

I was contacted today by Richard Kershaw of WhoIsHostingThis.com, who wrote to suggest a great web article called "What Is Web Hosting, Exactly? Everything You Wanted To Know".

It is a very thorough introduction to the different types of web hosting that are available.

His web site has a lot of very useful resources, everything from cookies to .htaccess to pattern generators.

Another useful reference source.

I have recently shifted to working on the front end of a headless project, which means a crash course in CSS 3 and HTML5.

A lot has changed in 10 years. Back in 2009 it was XHTML and CSS 2.0 for me, with no JavaScript.

So I'm planning to learn along the way. I'm also checking out web accessibility requirements for screen readers and other assistive technology.

Resources






Lessons from 300k+ Lines of Infrastructure Code

From QCon London, March 30, 2019.

"Yevgeniy Brikman shares key lessons from the “Infrastructure Cookbook” they developed at Gruntwork while creating and maintaining a library of over 300,000 lines of infrastructure code used in production by hundreds of companies. Topics include how to design infrastructure APIs, automated tests for infrastructure code, patterns for reuse and composition, refactoring, namespacing, and more."

Mikes Notes




"Brikman: Thank you all for coming. This is a talk, as Jonas mentioned, of the ugly layer beneath your microservices, all the infrastructure under the hood that it's going to take to make them work and some of the lessons we learned. There's not a great term for this. I'm just going to use the word DevOps, although it's not super well-defined. And one of the things I want to share is a confession about DevOps to start the talk off.

Hey, there it is. There's a limited range on this thing. So here's the confession - the DevOps industry is very much in the stone ages, and I don't say that to be mean or to insult anybody. I just mean, literally, we are still new to this. We have only been doing DevOps, at least as a term, for a few years. Still figuring out how to do it. But what's scary about that is we're being asked to build things that are very modern. We're being asked to put together these amazing, cutting edge infrastructures, but I feel like the tooling we're using to do it looks something like that.

Now, you wouldn't know that if all you did was read blog posts, read the headlines, everything sounds really cutting edge. Half the talks here, half the blog posts out there, they're going to be about, oh my God, Kubernetes, and Docker, and microservices, and service meshes, and all these unbelievable things that sound really cutting edge. But for me as a developer, on a day to day basis, it doesn't feel quite so cutting edge, right? That's not what my day-to-day experience feels like. My day-to-day experience with DevOps feels a little more like that. You're cooking pizza on an iron with a blow dryer. This is your hashtag for this talk, #thisisdevops. This is what it feels like.

..."

Gremlin Announces Free Tier for Their Chaos Experimentation Platform

From InfoQ

"The Gremlin team has announced "Gremlin Free", which provides the ability to run chaos engineering experiments on a free tier of their failure-as-a-service SaaS platform. The current version of the free tier allows the execution of shutdown and CPU attacks on a host or container, which can be controlled via a simple web-based user interface, API or CLI.

At the end of 2017 the Gremlin team announced the first release of their chaos experimentation SaaS product that supported the coordination of multiple attacks on hosts and the associated infrastructure. In 2018 application-level failure injection (ALFI) was also added, which supported running attacks on individual application services or functions. One of the primary attacks throughout the evolution of the product has been the shutdown of instances, which was partly inspired by the Netflix Chaos Monkey -- one of the first chaos engineering tools within the cloud computing domain.

The Gremlin team has argued that although the Chaos Monkey tool is useful, it does require time to learn how to safely operate. The original tool also only supported AWS (although additional tooling has emerged that offers similar instance shutdown abilities within Azure and Google Cloud Platform). With the launch of Gremlin Free, the Gremlin team is aiming to reduce these barriers to running chaos experiments, and facilitate teams in quickly seeing the value from doing so.

For engineers looking to explore the new free tier, Tammy Butow, principal SRE at Gremlin, has created a "Shutdown Experiment Pack" that is available on the Gremlin website. This provides a detailed walkthrough for running five chaos experiments that shut down cloud infrastructure hosts and containers on AWS, Azure, and GCP (for which cloud vendor accounts are required), and also shut down containers running locally with Docker.

InfoQ recently sat down with Lorne Kligerman, director of product, and discussed the motivations and future plans with Gremlin Free."

And you can read the rest of the interview here.

Mikes Notes

  • Roll this into DevOps.
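To make the idea concrete, a CPU attack at its core just saturates a number of cores for a bounded time so that you can watch how monitoring, alerting and autoscaling react. Here is a minimal Python sketch of that idea (purely illustrative; this is not Gremlin's client or implementation):

# cpu_attack_sketch.py - pegs a few CPU cores for a fixed duration.
# Illustrative only; Gremlin's real attacks add safety controls on top.
import multiprocessing
import time


def burn(deadline):
    """Spin in a tight loop until the deadline passes, saturating one core."""
    while time.time() < deadline:
        pass


def cpu_attack(cores=2, seconds=30):
    """Run one busy-loop worker per requested core for `seconds` seconds."""
    deadline = time.time() + seconds
    workers = [multiprocessing.Process(target=burn, args=(deadline,))
               for _ in range(cores)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()


if __name__ == "__main__":
    cpu_attack(cores=2, seconds=30)  # values chosen only for illustration

What Gremlin adds around this is the part that matters in production: safety controls, limits on the blast radius, scheduling, and the ability to halt an attack from the web UI, API or CLI.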

Otherworldly Java: Gateway to the Moon and beyond

A recent talk at QCon London by Diane Craig Davis really intrigued me.

The talk is well worth watching.

"Diane Craig Davis is an astrodynamicist and principal systems engineer with NASA and USAF aerospace industry leader AI solutions. She designs spacecraft orbits with the Gateway trajectory team at Johnson Space Center in Houston, TX, and previously navigated spacecraft to Mars and comets at the Jet Propulsion Laboratory. She is the lead researcher for the Deep Space Trajectory Explorer." (from QCon)

The talk was about Deep Space Trajectory Explorer, a JavaFX-based trajectory design and visualization software package that features a mix of custom 2D and 3D visualizations.

Astrodynamicist Diane Craig Davis worked with JavaFX developer Sean Phillips to find a way to visualise the possible trajectories of three-body systems such as the Earth, the Moon and a spacecraft.

Three-body systems are inherently chaotic.

The closest approach in an orbit is called a periapsis point. A collection of potential periapsis points is called a Poincaré map. Deep Space Trajectory Explorer enables a scientist to quickly and simply see a million possible orbits using a Poincaré map, and then whittle them down to find the best path for a trajectory.
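To get a feel for what a periapsis Poincaré map is, here is a minimal Python sketch (my own illustration, not the Deep Space Trajectory Explorer): integrate a sweep of trajectories in the planar circular restricted three-body problem (CR3BP) and scatter-plot the points where each trajectory passes closest to the Moon. Every initial condition and parameter value below is arbitrary, chosen only to show the idea; it needs NumPy, SciPy and Matplotlib.

# Periapsis Poincare map sketch for the planar Earth-Moon CR3BP.
import numpy as np
from scipy.integrate import solve_ivp
import matplotlib.pyplot as plt

MU = 0.01215  # approximate Earth-Moon mass parameter


def crtbp(t, s):
    """Planar CR3BP equations of motion in the rotating frame."""
    x, y, vx, vy = s
    r1 = np.hypot(x + MU, y)          # distance to Earth
    r2 = np.hypot(x - 1.0 + MU, y)    # distance to Moon
    ax = 2.0 * vy + x - (1.0 - MU) * (x + MU) / r1**3 - MU * (x - 1.0 + MU) / r2**3
    ay = -2.0 * vx + y - (1.0 - MU) * y / r1**3 - MU * y / r2**3
    return [vx, vy, ax, ay]


def periapsis(t, s):
    """Zero when the radial velocity relative to the Moon vanishes."""
    x, y, vx, vy = s
    return (x - 1.0 + MU) * vx + y * vy

periapsis.direction = 1.0  # approaching -> receding marks a periapsis


def impact(t, s):
    """Stop any trajectory that gets unrealistically close to the Moon."""
    x, y, vx, vy = s
    return np.hypot(x - 1.0 + MU, y) - 0.005

impact.terminal = True
impact.direction = -1.0

points = []
for vy0 in np.linspace(0.4, 0.9, 40):            # sweep one initial condition
    sol = solve_ivp(crtbp, (0.0, 20.0), [0.83, 0.0, 0.0, vy0],
                    events=(periapsis, impact), rtol=1e-9, atol=1e-12)
    for xe, ye, _, _ in sol.y_events[0]:          # periapsis crossings
        points.append((xe, ye))

if points:
    xs, ys = zip(*points)
    plt.scatter(xs, ys, s=2, label="periapsis points")
plt.scatter([1.0 - MU], [0.0], c="red", label="Moon")
plt.xlabel("x (rotating frame)")
plt.ylabel("y (rotating frame)")
plt.legend()
plt.title("Periapsis Poincare map (illustrative sweep)")
plt.show()

The real tool does this for millions of trajectories and lets the designer brush and filter the resulting point cloud interactively; the sketch only shows where the points come from.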

Here are another two videos by Diane Craig Davis and Sean Phillips, found on YouTube.






Sean Phillips also gave a great two-hour talk and demo at the Java User Group Dahlgren on JavaFX and the NetBeans IDE.



Sean also created F(X)yz, a JavaFX 3D library available on GitHub.

"My new 3D library for JavaFX 8. F(X)yz provides useful primitives and more complex data visualization components that are unavailable in the base JavaFX 3D package."

Mikes Notes

  • JavaFX improves on Java Swing by requiring a lot less code to get things done.
  • Canvas enables on-the-fly rendering of millions of points for real-time animation.
  • Sean Phillips now works at Johns Hopkins.
  • NetBeans IDE looks great (not too cluttered).
  • I have only used Eclipse and Visual Studio to work with Java. Now I think I will switch to NetBeans.
  • Use Eclipse for XML technology.

Quanta Writers and Editors Discuss Trends in Science and Math

Mikes Notes

This is part 3 of a series of 3 posts reprinting articles that cover the crisis in particle physics that has emerged since scientists at CERN failed to discover the whole range of new particles predicted by string theorists.

In my opinion, string theory is complete bollocks, as is the multiverse. Nature is a book to be read, not written. Maths is largely an invention rather than a discovery. Theory needs to be tested and proven by experimentation.

Below is an article about a panel discussion that takes up many of the issues confronting particle physics, the nature of mathematics and so on.

The video of the panel discussion is well worth watching. Natalie Wolchover in her panel comments discusses the issues raised at the workshop she previously wrote about in A Fight for the Soul of Science.


By Thomas Lin
Editor in Chief, Quanta Magazine
December 16, 2015.
Reprinted from Quanta Magazine

On November 16, more than 200 readers joined writers and editors from Quanta Magazine for a panel discussion exploring the latest ideas in fundamental physics, biology and mathematics research.




From left: Thomas Lin, Natalie Wolchover, John Rennie, Kevin Hartnett and Robbert Dijkgraaf.







Why doesn’t our universe make sense? What is time? What is life? On Friday, more than 200 readers joined writers and editors from Quanta Magazine at the Simons Foundation for a wide-ranging panel discussion that examined the newest ideas in fundamental physics, biology and mathematics research, including the questions of whether our universe is “natural,” the nature of time, the origin and evolution of life, whether mathematics is invented or discovered and what role it plays in science and society. These are just some of the topics presented in Quanta’s two new books published by The MIT Press: Alice and Bob Meet the Wall of Fire and The Prime Number Conspiracy.




PANELISTS
  • Robbert Dijkgraaf, director of the Institute for Advanced Study
  • Kevin Hartnett, senior writer at Quanta Magazine
  • John Rennie, deputy editor at Quanta Magazine
  • Natalie Wolchover, senior writer and editor at Quanta Magazine

MODERATOR
  • Thomas Lin, editor in chief at Quanta Magazine

A Fight for the Soul of Science

Mikes Notes

This is part 2 of a series of 3 posts reprinting articles that cover the crisis in particle physics that has emerged since scientists at CERN failed to discover the whole range of new particles predicted by string theorists.

In my opinion, string theory is complete bollocks, as is the multiverse. Nature is a book to be read, not written. Maths is largely an invention rather than a discovery. Theory needs to be tested and proven by experimentation.

The article below is a report from a 3-day workshop in Germany. The workshop was held in response to the Nature article "Scientific method: Defend the integrity of physics".



By Natalie Wolchover
Senior Writer/Editor
Quanta Magazine
December 16, 2015.

Reprinted from Quanta Magazine

String theory, the multiverse and other ideas of modern physics are potentially untestable. At a historic meeting in Munich, scientists and philosophers asked: should we trust them anyway?

Physicists typically think they “need philosophers and historians of science like birds need ornithologists,” the Nobel laureate David Gross told a roomful of philosophers, historians and physicists last week in Munich, Germany, paraphrasing Richard Feynman.


Physicists George Ellis (center) and Joe Silk (right) at Ludwig Maximilian University in Munich on Dec. 7.

Laetitia Vancon for Quanta Magazine






But desperate times call for desperate measures.

Fundamental physics faces a problem, Gross explained — one dire enough to call for outsiders’ perspectives. “I’m not sure that we don’t need each other at this point in time,” he said.

It was the opening session of a three-day workshop, held in a Romanesque-style lecture hall at Ludwig Maximilian University (LMU Munich) one year after George Ellis and Joe Silk, two white-haired physicists now sitting in the front row, called for such a conference in an incendiary opinion piece in Nature. One hundred attendees had descended on a land with a celebrated tradition in both physics and the philosophy of science to wage what Ellis and Silk declared a “battle for the heart and soul of physics.”

The crisis, as Ellis and Silk tell it, is the wildly speculative nature of modern physics theories, which they say reflects a dangerous departure from the scientific method. Many of today’s theorists — chief among them the proponents of string theory and the multiverse hypothesis — appear convinced of their ideas on the grounds that they are beautiful or logically compelling, despite the impossibility of testing them. Ellis and Silk accused these theorists of “moving the goalposts” of science and blurring the line between physics and pseudoscience. “The imprimatur of science should be awarded only to a theory that is testable,” Ellis and Silk wrote, thereby disqualifying most of the leading theories of the past 40 years. “Only then can we defend science from attack.”

They were reacting, in part, to the controversial ideas of Richard Dawid, an Austrian philosopher whose 2013 book String Theory and the Scientific Method identified three kinds of “non-empirical” evidence that Dawid says can help build trust in scientific theories absent empirical data. Dawid, a researcher at LMU Munich, answered Ellis and Silk’s battle cry and assembled far-flung scholars anchoring all sides of the argument for the high-profile event last week.


David Gross, a theoretical physicist at the University of California, Santa Barbara.

Laetitia Vancon for Quanta Magazine

Gross, a supporter of string theory who won the 2004 Nobel Prize in Physics for his work on the force that glues atoms together, kicked off the workshop by asserting that the problem lies not with physicists but with a “fact of nature” — one that we have been approaching inevitably for four centuries.

The dogged pursuit of a fundamental theory governing all forces of nature requires physicists to inspect the universe more and more closely — to examine, for instance, the atoms within matter, the protons and neutrons within those atoms, and the quarks within those protons and neutrons. But this zooming in demands evermore energy, and the difficulty and cost of building new machines increases exponentially relative to the energy requirement, Gross said. “It hasn’t been a problem so much for the last 400 years, where we’ve gone from centimeters to millionths of a millionth of a millionth of a centimeter” — the current resolving power of the Large Hadron Collider (LHC) in Switzerland, he said. “We’ve gone very far, but this energy-squared is killing us.”

As we approach the practical limits of our ability to probe nature’s underlying principles, the minds of theorists have wandered far beyond the tiniest observable distances and highest possible energies. Strong clues indicate that the truly fundamental constituents of the universe lie at a distance scale 10 million billion times smaller than the resolving power of the LHC. This is the domain of nature that string theory, a candidate “theory of everything,” attempts to describe. But it’s a domain that no one has the faintest idea how to access.

The problem also hampers physicists’ quest to understand the universe on a cosmic scale: No telescope will ever manage to peer past our universe’s cosmic horizon and glimpse the other universes posited by the multiverse hypothesis. Yet modern theories of cosmology lead logically to the possibility that our universe is just one of many.



Tynan DeBold for Quanta Magazine; Icons via Freepik

Whether the fault lies with theorists for getting carried away, or with nature, for burying its best secrets, the conclusion is the same: Theory has detached itself from experiment. The objects of theoretical speculation are now too far away, too small, too energetic or too far in the past to reach or rule out with our earthly instruments. So, what is to be done? As Ellis and Silk wrote, “Physicists, philosophers and other scientists should hammer out a new narrative for the scientific method that can deal with the scope of modern physics.”

“The issue in confronting the next step,” said Gross, “is not one of ideology but strategy: What is the most useful way of doing science?”

Over three mild winter days, scholars grappled with the meaning of theory, confirmation and truth; how science works; and whether, in this day and age, philosophy should guide research in physics or the other way around. Over the course of these pressing yet timeless discussions, a degree of consensus took shape.

Rules of the Game
Throughout history, the rules of science have been written on the fly, only to be revised to fit evolving circumstances. The ancients believed they could reason their way toward scientific truth. Then, in the 17th century, Isaac Newton ignited modern science by breaking with this “rationalist” philosophy, adopting instead the “empiricist” view that scientific knowledge derives only from empirical observation. In other words, a theory must be proved experimentally to enter the book of knowledge.

But what requirements must an untested theory meet to be considered scientific? Theorists guide the scientific enterprise by dreaming up the ideas to be put to the test and then interpreting the experimental results; what keeps theorists within the bounds of science?

Today, most physicists judge the soundness of a theory by using the Austrian-British philosopher Karl Popper’s rule of thumb. In the 1930s, Popper drew a line between science and nonscience in comparing the work of Albert Einstein with that of Sigmund Freud. Einstein’s theory of general relativity, which cast the force of gravity as curves in space and time, made risky predictions — ones that, if they hadn’t succeeded so brilliantly, would have failed miserably, falsifying the theory. But Freudian psychoanalysis was slippery: Any fault of your mother’s could be worked into your diagnosis. The theory wasn’t falsifiable, and so, Popper decided, it wasn’t science.


Paul Teller (by window), a philosopher and professor emeritus at the University of California, Davis.

Laetitia Vancon for Quanta Magazine

Critics accuse string theory and the multiverse hypothesis, as well as cosmic inflation — the leading theory of how the universe began — of falling on the wrong side of Popper’s line of demarcation. To borrow the title of the Columbia University physicist Peter Woit’s 2006 book on string theory, these ideas are “not even wrong,” say critics. In their editorial, Ellis and Silk invoked the spirit of Popper: “A theory must be falsifiable to be scientific.”

But, as many in Munich were surprised to learn, falsificationism is no longer the reigning philosophy of science. Massimo Pigliucci, a philosopher at the Graduate Center of the City University of New York, pointed out that falsifiability is woefully inadequate as a separator of science and nonscience, as Popper himself recognized. Astrology, for instance, is falsifiable — indeed, it has been falsified ad nauseam — and yet it isn’t science. Physicists’ preoccupation with Popper “is really something that needs to stop,” Pigliucci said. “We need to talk about current philosophy of science. We don’t talk about something that was current 50 years ago.”

Nowadays, as several philosophers at the workshop said, Popperian falsificationism has been supplanted by Bayesian confirmation theory, or Bayesianism, a modern framework based on the 18th-century probability theory of the English statistician and minister Thomas Bayes. Bayesianism allows for the fact that modern scientific theories typically make claims far beyond what can be directly observed — no one has ever seen an atom — and so today’s theories often resist a falsified-unfalsified dichotomy. Instead, trust in a theory often falls somewhere along a continuum, sliding up or down between 0 and 100 percent as new information becomes available. “The Bayesian framework is much more flexible” than Popper’s theory, said Stephan Hartmann, a Bayesian philosopher at LMU. “It also connects nicely to the psychology of reasoning.”

Gross concurred, saying that, upon learning about Bayesian confirmation theory from Dawid’s book, he felt “somewhat like the Molière character who said, ‘Oh my God, I’ve been talking prose all my life!’”

Another advantage of Bayesianism, Hartmann said, is that it is enabling philosophers like Dawid to figure out “how this non-empirical evidence fits in, or can be fit in.”

Another Kind of Evidence
Dawid, who is 49, mild-mannered and smiley with floppy brown hair, started his career as a theoretical physicist. In the late 1990s, during a stint at the University of California, Berkeley, a hub of string-theory research, Dawid became fascinated by how confident many string theorists seemed to be that they were on the right track, despite string theory’s complete lack of empirical support. “Why do they trust the theory?” he recalls wondering. “Do they have different ways of thinking about it than the canonical understanding?”

String theory says that elementary particles have dimensionality when viewed close-up, appearing as wiggling loops (or “strings”) and membranes at nature’s highest zoom level. According to the theory, extra dimensions also materialize in the fabric of space itself. The different vibrational modes of the strings in this higher-dimensional space give rise to the spectrum of particles that make up the observable world. In particular, one of the vibrational modes fits the profile of the “graviton” — the hypothetical particle associated with the force of gravity. Thus, string theory unifies gravity, now described by Einstein’s theory of general relativity, with the rest of particle physics.



Video: Richard Dawid, a physicist-turned-philosopher at Ludwig Maximilian University in Munich.


Laetitia Vancon for Quanta Magazine

However string theory, which has its roots in ideas developed in the late 1960s, has made no testable predictions about the observable universe. To understand why so many researchers trust it anyway, Dawid signed up for some classes in philosophy of science, and upon discovering how little study had been devoted to the phenomenon, he switched fields.

In the early 2000s, he identified three non-empirical arguments that generate trust in string theory among its proponents. First, there appears to be only one version of string theory capable of achieving unification in a consistent way (though it has many different mathematical representations); furthermore, no other “theory of everything” capable of unifying all the fundamental forces has been found, despite immense effort. (A rival approach called loop quantum gravity describes gravity at the quantum scale, but makes no attempt to unify it with the other forces.) This “no-alternatives” argument, colloquially known as “string theory is the only game in town,” boosts theorists’ confidence that few or no other possible unifications of the four fundamental forces exist, making it more likely that string theory is the right approach.

Second, string theory grew out of the Standard Model — the accepted, empirically validated theory incorporating all known fundamental particles and forces (apart from gravity) in a single mathematical structure — and the Standard Model also had no alternatives during its formative years. This “meta-inductive” argument, as Dawid calls it, buttresses the no-alternatives argument by showing that it has worked before in similar contexts, countering the possibility that physicists simply aren’t clever enough to find the alternatives that exist.



Emily Fuhrman for Quanta Magazine, with text by Natalie Wolchover and art direction by Olena Shmahalo.

The third non-empirical argument is that string theory has unexpectedly delivered explanations for several other theoretical problems aside from the unification problem it was intended to address. The staunch string theorist Joe Polchinski of the University of California, Santa Barbara, presented several examples of these “unexpected explanatory interconnections,” as Dawid has termed them, in a paper read in Munich in his absence. String theory explains the entropy of black holes, for example, and, in a surprising discovery that has caused a surge of research in the past 15 years, is mathematically translatable into a theory of particles, such as the theory describing the nuclei of atoms.

Polchinski concludes that, considering how far away we are from the exceptionally fine grain of nature’s fundamental distance scale, we should count ourselves lucky: “String theory exists, and we have found it.” (Polchinski also used Dawid’s non-empirical arguments to calculate the Bayesian odds that the multiverse exists as 94 percent — a value that has been ridiculed by the Internet’s vocal multiverse critics.)

One concern with including non-empirical arguments in Bayesian confirmation theory, Dawid acknowledged in his talk, is “that it opens the floodgates to abandoning all scientific principles.” One can come up with all kinds of non-empirical virtues when arguing in favor of a pet idea. “Clearly the risk is there, and clearly one has to be careful about this kind of reasoning,” Dawid said. “But acknowledging that non-empirical confirmation is part of science, and has been part of science for quite some time, provides a better basis for having that discussion than pretending that it wasn’t there, and only implicitly using it, and then saying I haven’t done it. Once it’s out in the open, one can discuss the pros and cons of those arguments within a specific context.”

The Munich Debate

Laetitia Vancon for Quanta Magazine

The trash heap of history is littered with beautiful theories. The Danish historian of cosmology Helge Kragh, who detailed a number of these failures in his 2011 book, Higher Speculations, spoke in Munich about the 19th-century vortex theory of atoms. This “Victorian theory of everything,” developed by the Scots Peter Tait and Lord Kelvin, postulated that atoms are microscopic vortexes in the ether, the fluid medium that was believed at the time to fill space. Hydrogen, oxygen and all other atoms were, deep down, just different types of vortical knots. At first, the theory “seemed to be highly promising,” Kragh said. “People were fascinated by the richness of the mathematics, which could keep mathematicians busy for centuries, as was said at the time.” Alas, atoms are not vortexes, the ether does not exist, and theoretical beauty is not always truth.

Except sometimes it is. Rationalism guided Einstein toward his theory of relativity, which he believed in wholeheartedly on rational grounds before it was ever tested. “I hold it true that pure thought can grasp reality, as the ancients dreamed,” Einstein said in 1933, years after his theory had been confirmed by observations of starlight bending around the sun.

The question for the philosophers is: Without experiments, is there any way to distinguish between the non-empirical virtues of vortex theory and those of Einstein’s theory? Can we ever really trust a theory on non-empirical grounds?

In discussions on the third afternoon of the workshop, the LMU philosopher Radin Dardashti asserted that Dawid’s philosophy specifically aims to pinpoint which non-empirical arguments should carry weight, allowing scientists to “make an assessment that is not based on simplicity, which is not based on beauty.” Dawidian assessment is meant to be more objective than these measures, Dardashti explained — and more revealing of a theory’s true promise.

Gross said Dawid has “described beautifully” the strategies physicists use “to gain confidence in a speculation, a new idea, a new theory.”

“You mean confidence that it’s true?” asked Peter Achinstein, an 80-year-old philosopher and historian of science at Johns Hopkins University. “Confidence that it’s useful? confidence that …”

“Let’s give an operational definition of confidence: I will continue to work on it,” Gross said.

“That’s pretty low,” Achinstein said.

“Not for science,” Gross said. “That’s the question that matters.”

Kragh pointed out that even Popper saw value in the kind of thinking that motivates string theorists today. Popper called speculation that did not yield testable predictions “metaphysics,” but he considered such activity worthwhile, since it might become testable in the future. This was true of atomic theory, which many 19th-century physicists feared would never be empirically confirmed. “Popper was not a naive Popperian,” Kragh said. “If a theory is not falsifiable,” Kragh said, channeling Popper, “it should not be given up. We have to wait.”

But several workshop participants raised qualms about Bayesian confirmation theory, and about Dawid’s non-empirical arguments in particular.

Carlo Rovelli, a proponent of loop quantum gravity (string theory’s rival) who is based at Aix-Marseille University in France, objected that Bayesian confirmation theory does not allow for an important distinction that exists in science between theories that scientists are certain about and those that are still being tested. The Bayesian “confirmation” that atoms exist is essentially 100 percent, as a result of countless experiments. But Rovelli says that the degree of confirmation of atomic theory shouldn’t even be measured in the same units as that of string theory. String theory is not, say, 10 percent as confirmed as atomic theory; the two have different statuses entirely. “The problem with Dawid’s ‘non-empirical confirmation’ is that it muddles the point,” Rovelli said. “And of course some string theorists are happy of muddling it this way, because they can then say that string theory is ‘confirmed,’ equivocating.”

The German physicist Sabine Hossenfelder, in her talk, argued that progress in fundamental physics very often comes from abandoning cherished prejudices (such as, perhaps, the assumption that the forces of nature must be unified). Echoing this point, Rovelli said “Dawid’s idea of non-empirical confirmation [forms] an obstacle to this possibility of progress, because it bases our credence on our own previous credences.” It “takes away one of the tools — maybe the soul itself — of scientific thinking,” he continued, “which is ‘do not trust your own thinking.’”

The Munich proceedings will be compiled and published, probably as a book, in 2017. As for what was accomplished, one important outcome, according to Ellis, was an acknowledgment by participating string theorists that the theory is not “confirmed” in the sense of being verified. “David Gross made his position clear: Dawid’s criteria are good for justifying working on the theory, not for saying the theory is validated in a non-empirical way,” Ellis wrote in an email. “That seems to me a good position — and explicitly stating that is progress.”

In considering how theorists should proceed, many attendees expressed the view that work on string theory and other as-yet-untestable ideas should continue. “Keep speculating,” Achinstein wrote in an email after the workshop, but “give your motivation for speculating, give your explanations, but admit that they are only possible explanations.”

“Maybe someday things will change,” Achinstein added, “and the speculations will become testable; and maybe not, maybe never.” We may never know for sure the way the universe works at all distances and all times, “but perhaps you can narrow the live possibilities to just a few,” he said. “I think that would be some progress.”

This article was also reprinted in The Atlantic.

Scientific method: Defend the integrity of physics

Mikes Notes

This is part 1 of a series of 3 posts reprinting articles that cover the crisis in particle physics that has emerged since scientists at CERN failed to discover the whole range of new particles predicted by string theorists.


In my opinion, string theory is complete bollocks, as is the multiverse. Nature is a book to be read, not written. Maths is largely an invention rather than a discovery. Theory needs to be tested and proven by experimentation.

The article below is a stirring defence of science. I agree with it completely.

By George Ellis & Joe Silk
16 December 2014

Reprinted from Nature
Vol 516. Issue 7531

Attempts to exempt speculative theories of the Universe from experimental verification undermine science, argue George Ellis and Joe Silk.

This year, debates in physics circles took a worrying turn. Faced with difficulties in applying fundamental theories to the observed Universe, some researchers called for a change in how theoretical physics is done. They began to argue — explicitly — that if a theory is sufficiently elegant and explanatory, it need not be tested experimentally, breaking with centuries of philosophical tradition of defining scientific knowledge as empirical. We disagree. As the philosopher of science Karl Popper argued: a theory must be falsifiable to be scientific.

Chief among the 'elegance will suffice' advocates are some string theorists. Because string theory is supposedly the 'only game in town' capable of unifying the four fundamental forces, they believe that it must contain a grain of truth even though it relies on extra dimensions that we can never observe. Some cosmologists, too, are seeking to abandon experimental verification of grand hypotheses that invoke imperceptible domains such as the kaleidoscopic multiverse (comprising myriad universes), the 'many worlds' version of quantum reality (in which observations spawn parallel branches of reality) and pre-Big Bang concepts.

These unprovable hypotheses are quite different from those that relate directly to the real world and that are testable through observations — such as the standard model of particle physics and the existence of dark matter and dark energy. As we see it, theoretical physics risks becoming a no-man's-land between mathematics, physics and philosophy that does not truly meet the requirements of any.

The issue of testability has been lurking for a decade. String theory and multiverse theory have been criticized in popular books and articles, including some by one of us. In March, theorist Paul Steinhardt wrote in this journal that the theory of inflationary cosmology is no longer scientific because it is so flexible that it can accommodate any observational result. Theorist and philosopher Richard Dawid and cosmologist Sean Carroll have countered those criticisms with a philosophical case to weaken the testability requirement for fundamental physics.

We applaud the fact that Dawid, Carroll and other physicists have brought the problem out into the open. But the drastic step that they are advocating needs careful debate. This battle for the heart and soul of physics is opening up at a time when scientific results — in topics from climate change to the theory of evolution — are being questioned by some politicians and religious fundamentalists. Potential damage to public confidence in science and to the nature of fundamental physics needs to be contained by deeper dialogue between scientists and philosophers.

String theory

String theory is an elaborate proposal for how minuscule strings (one-dimensional space entities) and membranes (higher-dimensional extensions) existing in higher-dimensional spaces underlie all of physics. The higher dimensions are wound so tightly that they are too small to observe at energies accessible through collisions in any practicable future particle detector.

Some aspects of string theory can be tested experimentally in principle. For example, a hypothesized symmetry between fermions and bosons central to string theory — supersymmetry — predicts that each kind of particle has an as-yet-unseen partner. No such partners have yet been detected by the Large Hadron Collider at CERN, Europe's particle-physics laboratory near Geneva, Switzerland, limiting the range of energies at which supersymmetry might exist. If these partners continue to elude detection, then we may never know whether they exist. Proponents could always claim that the particles' masses are higher than the energies probed.

“The consequences of overclaiming the significance of certain theories are profound.”

Dawid argues that the veracity of string theory can be established through philosophical and probabilistic arguments about the research process. Citing Bayesian analysis, a statistical method for inferring the likelihood that an explanation fits a set of facts, Dawid equates confirmation with the increase of the probability that a theory is true or viable. But that increase of probability can be purely theoretical. Because “no-one has found a good alternative” and “theories without alternatives tended to be viable in the past”, he reasons that string theory should be taken to be valid.

In our opinion, this is moving the goalposts. Instead of belief in a scientific theory increasing when observational evidence arises to support it, he suggests that theoretical discoveries bolster belief. But conclusions arising logically from mathematics need not apply to the real world. Experiments have proved many beautiful and simple theories wrong, from the steady-state theory of cosmology to the SU(5) Grand Unified Theory of particle physics, which aimed to unify the electroweak force and the strong force. The idea that preconceived truths about the world can be inferred beyond established facts (inductivism) was overturned by Popper and other twentieth-century philosophers.

We cannot know that there are no alternative theories. We may not have found them yet. Or the premise might be wrong. There may be no need for an overarching theory of four fundamental forces and particles if gravity, an effect of space-time curvature, differs from the strong, weak and electromagnetic forces that govern particles. And with its many variants, string theory is not even well defined: in our view, it is a promissory note that there might be such a unified theory.

Many multiverses

The multiverse is motivated by a puzzle: why fundamental constants of nature, such as the fine-structure constant that characterizes the strength of electromagnetic interactions between particles and the cosmological constant associated with the acceleration of the expansion of the Universe, have values that lie in the small range that allows life to exist. Multiverse theory claims that there are billions of unobservable sister universes out there in which all possible values of these constants can occur. So somewhere there will be a bio-friendly universe like ours, however improbable that is.

Some physicists consider that the multiverse has no challenger as an explanation of many otherwise bizarre coincidences. The low value of the cosmological constant — known to be 120 factors of 10 smaller than the value predicted by quantum field theory — is difficult to explain, for instance.

Earlier this year, championing the multiverse and the many-worlds hypothesis, Carroll dismissed Popper's falsifiability criterion as a “blunt instrument” (see go.nature.com/nuj39z). He offered two other requirements: a scientific theory should be “definite” and “empirical”. By definite, Carroll means that the theory says “something clear and unambiguous about how reality functions”. By empirical, he agrees with the customary definition that a theory should be judged a success or failure by its ability to explain the data.

He argues that inaccessible domains can have a “dramatic effect” in our cosmic back-yard, explaining why the cosmological constant is so small in the part we see. But in multiverse theory, that explanation could be given no matter what astronomers observe. All possible combinations of cosmological parameters would exist somewhere, and the theory has many variables that can be tweaked. Other theories, such as unimodular gravity, a modified version of Einstein's general theory of relativity, can also explain why the cosmological constant is not huge.

Some people have devised forms of multiverse theory that are susceptible to tests: physicist Leonard Susskind's version can be falsified if negative spatial curvature of the Universe is ever demonstrated. But such a finding would prove nothing about the many other versions. Fundamentally, the multiverse explanation relies on string theory, which is as yet unverified, and on speculative mechanisms for realizing different physics in different sister universes. It is not, in our opinion, robust, let alone testable.

The many-worlds theory of quantum reality posed by physicist Hugh Everett is the ultimate quantum multiverse, where quantum probabilities affect the macroscopic. According to Everett, each of Schrödinger's famous cats, the dead and the live, poisoned or not in its closed box by random radioactive decays, is real in its own universe. Each time you make a choice, even one as mundane as whether to go left or right, an alternative universe pops out of the quantum vacuum to accommodate the other action.

Billions of universes — and of galaxies and copies of each of us — accumulate with no possibility of communication between them or of testing their reality. But if a duplicate self exists in every multiverse domain and there are infinitely many, which is the real 'me' that I experience now? Is any version of oneself preferred over any other? How could 'I' ever know what the 'true' nature of reality is if one self favours the multiverse and another does not?

In our view, cosmologists should heed mathematician David Hilbert's warning: although infinity is needed to complete mathematics, it occurs nowhere in the physical Universe.

Pass the test
We agree with theoretical physicist Sabine Hossenfelder: post-empirical science is an oxymoron (see go.nature.com/p3upwp and go.nature.com/68rijj). Theories such as quantum mechanics and relativity turned out well because they made predictions that survived testing. Yet numerous historical examples point to how, in the absence of adequate data, elegant and compelling ideas led researchers in the wrong direction, from Ptolemy's geocentric theories of the cosmos to Lord Kelvin's 'vortex theory' of the atom and Fred Hoyle's perpetual steady-state Universe.

The consequences of overclaiming the significance of certain theories are profound — the scientific method is at stake (see go.nature.com/hh7mm6). To state that a theory is so good that its existence supplants the need for data and testing in our opinion risks misleading students and the public as to how science should be done and could open the door for pseudoscientists to claim that their ideas meet similar requirements.

What to do about it? Physicists, philosophers and other scientists should hammer out a new narrative for the scientific method that can deal with the scope of modern physics. In our view, the issue boils down to clarifying one question: what potential observational or experimental evidence is there that would persuade you that the theory is wrong and lead you to abandoning it? If there is none, it is not a scientific theory.

Such a case must be made in formal philosophical terms. A conference should be convened next year to take the first steps. People from both sides of the testability debate must be involved.

In the meantime, journal editors and publishers could assign speculative work to other research categories — such as mathematical rather than physical cosmology — according to its potential testability. And the domination of some physics departments and institutes by such activities could be rethought.

The imprimatur of science should be awarded only to a theory that is testable. Only then can we defend science from attack.

Unofficial Windows Binaries for Python Extension Packages

I found a great list of Unofficial Windows Binaries for Python Extension Packages maintained by Christoph Gohlke of the Laboratory for Fluorescence Dynamics, University of California, Irvine.

He writes ...

"This page provides 32- and 64-bit Windows binaries of many scientific open-source extension packages for the official CPython distribution of the Python programming language. A few binaries are available for the PyPy distribution.
The files are unofficial (meaning: informal, unrecognized, personal, unsupported, no warranty, no liability, provided "as is") and made available for testing and evaluation purposes.
Most binaries are built from source code found on PyPI or in the projects' public revision control systems. Source code changes, if any, have been submitted to the project maintainers or are included in the packages.
Refer to the documentation of the individual packages for license restrictions and dependencies.
If downloads fail, reload this page, enable JavaScript, disable download managers, disable proxies, clear cache, use Firefox, reduce number and frequency of downloads. Please only download files manually as needed.
Use pip version 9 or newer to install the downloaded .whl files. This page is not a pip package index.
Many binaries depend on numpy-1.15+mkl and the Microsoft Visual C++ 2008 (x64, x86, and SP1 for Python 2.7) or the Visual C++ 2017 (x64 or x86 for Python 3.x) redistributable packages.
Install numpy+mkl before other packages that depend on it.
The binaries are compatible with the most recent official CPython distributions on Windows >=6.0. Chances are they do not work with custom Python distributions included with Blender, Maya, ArcGIS, OSGeo4W, ABAQUS, Cygwin, Pythonxy, Canopy, EPD, Anaconda, WinPython etc. Many binaries are not compatible with Windows XP or Wine.
The packages are ZIP or 7z files, which allows for manual or scripted installation or repackaging of the content.
The files are provided "as is" without warranty or support of any kind. The entire risk as to the quality and performance is with you.
The opinions or statements expressed on this page should not be taken as a position or endorsement of the Laboratory for Fluorescence Dynamics or the University of California."

Mikes Notes

  • There must be hundreds of binaries listed

Installing TensorFlow on Windows

TensorFlow is one of the most widely used machine learning (ML) frameworks today.

It was developed by the Google Brain team and released under the Apache 2.0 license in 2015.

Official website: www.tensorflow.org

It can be run in the cloud on many platforms, including Google Cloud. It can also be downloaded and installed on mobile phones, desktop computers and local servers.

Windows Install

  • Very good step-by-step instructions here.
  • Also here.
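Once either guide has been followed, it is worth running a quick sanity check. Here is a minimal sketch, assuming a TensorFlow 1.x CPU build installed with "pip install tensorflow" (the GPU build is "pip install tensorflow-gpu"); adjust for whichever version the instructions gave you.

# Run inside the Python environment where TensorFlow was installed.
import tensorflow as tf

print("TensorFlow version:", tf.__version__)

# Classic TensorFlow 1.x smoke test: build a tiny graph and run it in a session.
hello = tf.constant("Hello, TensorFlow!")
with tf.Session() as sess:
    print(sess.run(hello))

If the version prints and the session runs without errors, the install is working. GPU users can additionally check that tf.test.is_gpu_available() returns True.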

ColdFusion and AJAX

While getting up to speed on AJAX, I wondered how it would work with a ColdFusion server. I spent many years developing in CFML some time back.

I have a lot to learn as I build a front end in AJAX.

Here are some useful links from a quick Google search.


Mikes Notes

  • What is most secure and robust?

Fourier Series 101

In mathematics, a Fourier series is a periodic function composed of harmonically related sinusoids, combined by a weighted summation. With appropriate weights, one cycle (or period) of the summation can be made to approximate an arbitrary function in that interval (or the entire function if it too is periodic). As such, the summation is a synthesis of another function.

... from Wikipedia.

So what does that mean in everyday English?

Mathematicians do have a tendency to make things much more complex than they are.

I found this wonderful explanation of what a Fourier series is on YouTube recently. It's well worth a watch.



So now the explanation at Wikipedia starts to make some sense.
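For reference, the standard form of that definition (my own summary, not quoted from Wikipedia) for a function f with period P is

f(x) \sim \frac{a_0}{2} + \sum_{n=1}^{\infty}\left(a_n \cos\frac{2\pi n x}{P} + b_n \sin\frac{2\pi n x}{P}\right),

where the weights are computed from one period of f:

a_n = \frac{2}{P}\int_{P} f(x)\cos\frac{2\pi n x}{P}\,dx, \qquad
b_n = \frac{2}{P}\int_{P} f(x)\sin\frac{2\pi n x}{P}\,dx.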

And MathWorld at Wolfram had this to say, which also now starts to make sense.

Encyclopedia of Math was not helpful at all, at least to me.

Just for fun, here is a Wolfram Demonstration. Just play with the buttons and sliders.

Fourier Synthesis for Selected Waveforms by Kenny F Stephens.
"Combine up to nine harmonic frequencies to visualize the resulting waveform using Fourier synthesis. Several standard waveforms are provided (sine, square, sawtooth, and triangle)."


Mikes Notes

  • A big-picture video is the best approach to learning new maths
  • Use Wolfram CDF
  • Jupyter?

News to build on: 122+ announcements from Google Cloud Next ‘19

From a Google Cloud blog announcement, April 13, 2019
By Alison Wagonfeld

"We hope you enjoyed Next ’19 as much as we did! The past few days brought our Google Cloud community together to learn about lots of new technologies and see how customers and partners are pushing their ideas and businesses forward with the cloud. It was a lot to digest, but we’ve boiled it down here into all the announcements from the week across infrastructure, application development, data management, smart analytics and AI, productivity, partnerships, and more.

Infrastructure
1. We announced two new regions in Seoul, South Korea and Salt Lake City, Utah to expand our global footprint and to support our growing customers around the world.

Hybrid Cloud
2. Anthos (the new name for Cloud Services Platform) is now generally available on Google Kubernetes Engine (GKE) and GKE On-Prem, so you can deploy, run and manage your applications on-premises or in the cloud. Coming soon, we’ll extend that flexibility to third-party clouds like AWS and Azure. And Anthos is launching with the support of more than 30 hardware, software and system integration partners so you can get up and running fast.

3. With Anthos Migrate, powered by Velostrata’s migration technology, you can auto-migrate VMs from on-premises or other clouds directly into containers in GKE with minimal effort.

4. Anthos Config Management lets you create multi-cluster policies out of the box that set and enforce role-based access controls, resource quotas, and namespaces—all from a single source of truth.

Serverless
5. Cloud Run, our fully managed serverless execution environment, offers serverless agility for containerized apps.

6. Cloud Run on GKE brings the serverless developer experience and workload portability to your GKE cluster.

7. Knative, the open API and runtime environment, brings a serverless developer experience and workload portability to your existing Kubernetes cluster anywhere.

8. We’re also making new investments in our Cloud Functions and App Engine platforms with new second generation runtimes, a new open-sourced Functions Framework, and additional core capabilities, including connectivity to private GCP resources.

DevOps/SRE
9. The new Cloud Code makes it easy to develop and deploy cloud-native applications on Kubernetes by extending your favorite local Integrated Development Environments (IDEs), IntelliJ and Visual Studio Code.

API Management
10. Apigee hybrid (beta) is a new deployment option for the Apigee API management platform that lets you host your runtime anywhere—in your data center or the public cloud of your choice.

11. Apigee security reporting (beta) offers visibility into the security status of your APIs.

12. Now you can consume a variety of Google Cloud services directly from the Apigee API Management platform, including Cloud Functions (secured by IAM), Cloud Data Loss Prevention (templates support), Cloud ML Engine, and BigQuery. See the full list of extensions here.

Data Management
Databases

13. Coming soon to Google Cloud: bring your existing SQL Server workloads to GCP and run them in a fully managed database service.

14. Cloud SQL for PostgreSQL now supports version 11, with useful new features like partitioning improvements, stored procedures, and more parallelism.

15. Cloud Bigtable multi-region replication is now generally available, giving you the flexibility to make your data available across a region or worldwide as demanded by your app.

Storage

16. A new low-cost archive class for Cloud Storage will offer the same consistent API as other classes of Cloud Storage and millisecond latency to access your content.

17. Cloud Filestore, our managed file storage system, is now generally available for high-performance storage needs.

18. Regional Persistent Disks will be generally available next week, providing active-active disk replication across two zones in the same region.

19. Bucket Policy Only is now in beta for Google Cloud Storage, so you can enforce Cloud IAM policies at the bucket level for consistent and uniform access control for your Cloud Storage buckets.

20. V4 signatures are now available in beta for Google Cloud Storage to provide improved security and let you access multiple object stores using the same application code. In addition to HMAC keys, V4 signed requests are also supported for Google RSA keys.

21. Cloud IAM roles are now available for Transfer Service, allowing security and IT administrators to use Cloud IAM permissions for creating, reading, updating, and deleting transfer jobs.

Networking
22. Traffic Director delivers configuration and traffic control intelligence to sidecar service proxies, providing global resiliency for your services by allowing you to deploy application instances in multiple Google Cloud regions.

23. High Availability VPN, soon in beta, lets you connect your on-premises deployment to GCP Virtual Private Cloud (VPC) with an industry-leading SLA of 99.99% service availability at general availability.

24. 100 Gbps Cloud Interconnect connects your hybrid and multi-cloud deployments.

25. Private Google Access from on-premises to the cloud is now generally available, allowing you to securely use Google services like Cloud Storage and BigQuery as well as third-party SaaS through Cloud Interconnect or VPN.

26. With Network Service Tiers, Google Cloud customers can customize their network for performance or price on a per-workload basis by selecting Premium or Standard Tier.

Security & Identity
Security

27. Access Approval (beta) is a first-of-its-kind capability that allows you to explicitly approve access to your data or configurations on GCP before it happens.

28. Data Loss Prevention (DLP) user interface (beta) lets you run DLP scans with just a few clicks—no code required, and no hardware or VMs to manage.

29. Virtual Private Cloud (VPC) Service Controls (GA) go beyond your VPC and let you define a security perimeter around specific GCP resources such as Cloud Storage buckets, Bigtable instances, and BigQuery datasets to help mitigate data exfiltration risks.

30. Cloud Security Command Center, a comprehensive security management and data risk platform for GCP, is now generally available.

31. Event Threat Detection in Cloud Security Command Center leverages Google-proprietary intelligence models to quickly detect damaging threats such as malware, crypto mining, and outgoing DDoS attacks. Sign up for the beta program.

32. Security Health Analytics in Cloud Security Command Center automatically scans your GCP infrastructure to help surface configuration issues with public storage buckets, open firewall ports, stale encryption keys, deactivated security logging, and much more. Sign up for the alpha program.

33. Cloud Security Scanner detects vulnerabilities such as cross-site-scripting (XSS), use of clear-text passwords, and outdated libraries in your GCP applications and displays results in Cloud Security Command Center. It’s GA for App Engine and now available in beta for GKE and Compute Engine.

34. Security partner integrations with Capsule8, Cavirin, Chef, McAfee, Redlock, Stackrox, Tenable.io, and Twistlock consolidate findings and speed up response. Find them on GCP Marketplace.

35. Stackdriver Incident Response and Management (coming soon to beta) in Cloud Security Command Center helps you respond to threats and remediate findings.

36. Container Registry vulnerability scanning (GA) identifies package vulnerabilities for Ubuntu, Debian, and Alpine Linux, so you can find vulnerabilities before your containers are deployed.

37. Binary Authorization (GA) is a deploy-time security control that integrates with your CI/CD system, gating images that do not meet your requirements from being deployed.

38. GKE Sandbox (beta), based on the open-source gVisor project, provides additional isolation for multi-tenant workloads, helping to prevent container escapes, and increasing workload security.

39. Managed SSL Certificates for GKE (beta) give you full lifecycle management (provisioning, deployment, renewal and deletion) of your GKE ingress certificates.

40. Shielded VMs (GA) provide verifiable integrity of your Compute Engine VM instances so you can be confident they haven't been compromised.

41. Policy Intelligence (alpha) uses ML to help you understand and manage your policies and reduce risk.

42. With Phishing Protection (beta), you can quickly report unsafe URLs to Google Safe Browsing and view status in Cloud Security Command Center.

43. reCAPTCHA Enterprise (beta) helps you defend your website against fraudulent activity like scraping, credential stuffing, and automated account creation and help prevent costly exploits from automated software.

Identity and access management

44. Context-aware access enhancements, including the launch of BeyondCorp Alliance, to help you define and enforce granular access to apps and infrastructure based on a user’s identity and the context of their request.

45. Your Android phone can now act as a built-in security key, the strongest defense against phishing.

46. Cloud Identity enhancements include single sign-on to thousands of additional apps and integration with human resource management systems (HRMS).

47. General availability of Identity Platform, which you can use to add identity management functionality to your own apps and services.

Smart Analytics
Data analytics

48. Data Fusion (beta) is a fully managed and cloud-native data integration service that helps you easily ingest and integrate data from various sources into BigQuery.

49. BigQuery Data Transfer Service (DTS) now supports 100+ SaaS apps, enabling you to lay the foundation for a data warehouse without writing a single line of code.

50. Cloud Dataflow SQL (public alpha) lets you build pipelines using familiar Standard SQL for unified batch and stream data processing.

51. Dataflow Flexible Resource Scheduling (FlexRS), in beta, helps you flexibly schedule batch processing jobs for cost savings.

52. Cloud Dataproc autoscaling (beta) removes the user burden associated with provisioning and decommissioning Hadoop and Spark clusters on Google Cloud Platform, providing you the same serverless convenience that you find in the rest of our data analytics platform.

53. Dataproc Presto job type (beta) helps you write simpler ad hoc Presto queries against disparate data sources like Cloud Storage and Hive metastore. Now both queries and scripts run as part of the native Dataproc API.

54. Dataproc Kerberos TLC (beta) enables Hadoop secure mode on Dataproc, with full API support for Kerberos. This new integration gives you cross-realm trust, RPC and SSL encryption, and KDC administrator configuration capabilities.

55. BigQuery BI Engine, in beta, is a fully-managed in-memory analysis service that powers visual analytics over big data with sub-second query response, high-concurrency, simplified BI architecture, and smart performance tuning.

56. Connected Sheets is a new type of spreadsheet that combines the simplicity of a spreadsheet interface with the power of BigQuery. With a few clicks, you can access BigQuery data in Sheets and securely share it with anyone in your organization.

57. BigQuery ML is now generally available with new model types you can call with SQL queries.

58. BigQuery: k-means clustering ML (beta) helps you establish groupings of data points based on axes or attributes that you specify, straight from Standard SQL in BigQuery.
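
To show what calling this "straight from Standard SQL" might look like, here is a minimal sketch that trains a k-means model with BigQuery ML from Python. The CREATE MODEL ... OPTIONS(model_type='kmeans') statement is the documented BigQuery ML syntax; the credentials, dataset, table, and column names are hypothetical placeholders, not anything from the announcement.

    # Minimal sketch: train a BigQuery ML k-means model from Python.
    # Assumes application-default credentials and a project with BigQuery enabled;
    # `mydataset.taxi_trips` and its columns are illustrative placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()

    sql = """
    CREATE OR REPLACE MODEL `mydataset.trip_clusters`
    OPTIONS (model_type = 'kmeans', num_clusters = 4) AS
    SELECT pickup_lat, pickup_lon, trip_distance
    FROM `mydataset.taxi_trips`
    """

    client.query(sql).result()  # blocks until the CREATE MODEL job completes

Once created, the model can be queried with ML.PREDICT from the same SQL interface to assign each row to a cluster.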

59. BigQuery: import TensorFlow models (alpha) lets you import your TensorFlow models and call them straight from BigQuery to create classification and prediction models.

60. BigQuery: TensorFlow DNN classifier helps you classify your data, based on a large number of features or signals. You can train and deploy a DNN model of your choosing straight from BigQuery’s Standard SQL interface.

61. BigQuery: TensorFlow DNN regressor lets you design a regression in TensorFlow and then call it to generate a trend line for your data in BigQuery.

62. Cloud Data Catalog (beta), a fully managed metadata discovery and management platform, helps organizations quickly discover, manage, secure, and understand their data assets.

63. Cloud Composer (generally available) helps you orchestrate your workloads across multiple clouds with a managed Apache Airflow service.

AI and machine learning

64. AI Platform (beta) helps teams prepare, build, run, and manage ML projects via the same shared interface.

65. AutoML Natural Language custom entity extraction and sentiment analysis (beta) lets you identify and isolate custom fields from input text and also train and serve industry-specific sentiment analysis models on your unstructured data.

66. AutoML Tables (beta) helps you turn your structured data into predictive insights. You can ingest your data for modeling from BigQuery, Cloud Storage, and other sources.

67. AutoML Vision object detection (beta) now helps you detect multiple objects in images, providing bounding boxes to identify object locations.

68. AutoML Vision Edge (beta) helps you deploy fast, high accuracy models at the edge, and trigger real-time actions based on local data.

69. AutoML Video Intelligence (beta) lets you upload your own video footage and custom tags, in order to train models that are specific to your business needs for tagging and retrieving video with custom attributes.

70. Document Understanding AI (beta) offers a scalable, serverless platform to automatically classify, extract, and digitize data within your scanned or digital documents.

71. Vision Product Search (GA) lets you build visual search functionality into mobile apps so customers can photograph an item and get a list of similar products from a retailer’s catalog.

72. Cloud Vision API—bundled enhancements (beta) lets you perform batch prediction, and document text detection now supports online annotation of PDFs, as well as files that contain a mix of scanned (raster) and rendered text.

73. Cloud Natural Language API—bundled enhancements (beta) now includes support for Russian and Japanese languages, as well as built-in entity extraction for receipts and invoices.

74. Our new V3 Translation API lets you define the vocabulary and terminology you want to override within translations as well as easily integrate your added brand-specific terms into your translation workflows.

75. Video Intelligence API—bundled enhancements (beta) lets content creators search for tagged aspects of their video footage. The API now supports optical character recognition (generally available), object tracking (also generally available), and new streaming video annotation capability (in beta).

76. Recommendations AI (beta) helps retailers provide personalized 1:1 recommendations to drive customer engagement and growth.

77. Contact Center AI (beta) is helping businesses build modern, intuitive customer care experiences with the help of Cloud AI.

Windows workloads on GCP
78. For your Microsoft workloads, in addition to purchasing on-demand licenses from Google Cloud, you now have the flexibility to bring your existing licenses to GCP.

79. Velostrata 4.2, our streaming migration tool, will soon give you the ability to specifically tag Microsoft workloads that require sole tenancy, and to automatically apply existing licenses.

80. Coming soon, you’ll be able to use Managed Service for Microsoft Active Directory (AD), a highly available, hardened Google Cloud service running actual Microsoft AD, to manage your cloud-based AD-dependent workloads, automate AD server maintenance and security configuration, and extend your on-premises AD domain to the cloud.

81. We’ve expanded Cloud SQL, our fully managed relational database service, to support Microsoft SQL Server, and we’ll be extending Anthos for hybrid deployments to Microsoft environments.

Productivity & Collaboration
G Suite

82. Google Assistant is integrating with Calendar, available in beta, to help you know when and where your next meeting is, and stay on top of scheduling changes.

83. G Suite Add-ons, coming soon to beta, offer a way for people to access their favorite workplace apps in the G Suite side panel to complete tasks, instead of toggling between multiple apps and tabs.

84. Third-party Cloud Search, now generally available for eligible customers, can help employees search—and find—digital assets and people in their company.

85. Drive metadata, available in beta, lets G Suite admins, and their delegates, create metadata categories and taxonomies to make content more discoverable in search.

86. Hangouts Meet updates, including automatic live captions (generally available), the ability to make live streams “public” (coming soon), and support for up to 250 participants in a single meeting (coming soon).

87. Google Voice for G Suite, generally available, gives businesses a phone number that works from anywhere, on any device, that can also transcribe voicemails and block spam calls with the help of Google AI.

88. Hangouts Chat in Gmail, available in beta, brings team communications together in one place on your desktop: the lower-left section of Gmail, which also highlights people, rooms, and bots.

89. Office editing in Google Docs, Sheets and Slides, generally available, lets you work on Office files straight from G Suite without having to worry about converting file types.

90. Visitor sharing in Google Drive, available in beta, provides a simple way for you to invite others outside of your organization to collaborate on files in G Suite using pincodes.

91. Currents (the new name for the enterprise version of Google+), available in beta, helps employees share ideas and engage in meaningful discussions with others across their organization, regardless of title or geography.

92. Access Transparency, generally available for G Suite Enterprise customers, provides granular visibility into data that’s accessed by Google Cloud employees for support purposes.

93. We enhanced our data regions to provide coverage for backups.

94. Advanced phishing and malware protection, available in beta, helps admins protect against anomalous attachments and inbound emails spoofing your domain in Google Groups.

95. Updates to the security center and alert center for G Suite provide integrated remediation so admins can take action against threats.

Chrome Enterprise

96. Chrome Browser Cloud Management lives within the Google Admin console, and it allows you to manage browsers in your Windows, Mac and Linux environments from a single location. You can see your enrolled browsers, and set and apply policies across them from the same place. We’ve opened up Chrome Browser Cloud Management to all enterprises, even if they aren’t using other Google products in their enterprise yet.

Customers
97. Hot off the presses: our 2019 Customer Voices book offers perspectives from 40 Google Cloud customers across 7 major industries.

98. Australia Post detailed how it delivers online and in-person for customers with the help of Google Cloud.

99. Baker Hughes is using Google Cloud to build advanced analytics products that solve complex industrial problems.

100. Colgate-Palmolive shared how it is using G Suite, and now GCP, to transform its business, taking advantage of data analytics and migrating its SAP workloads to Google Cloud.

101. Kohl’s described how it is moving most of its apps to the cloud in the next three years.

102. McKesson, a Fortune 6 company, shared how it aims to deliver more value to its customers and the healthcare industry through common platforms and resources.

103. Procter & Gamble shared how it is using Google Cloud to store, analyze, and activate its data.

104. Unilever used Google Cloud AI tools such as translation, visual analytics, and natural language processing (NLP) to generate insights faster and gain a deeper understanding of customer needs.

105. UPS described how it uses analytics on Google Cloud to gather and analyze more than a billion data points every day.

106. Viacom shared why it chose Google Cloud to perform automated content tagging, discovery and intelligence for more than 65 petabytes of content.

107. Whirlpool is using G Suite to completely transform the way its workforce collaborates.

108. Wix helps developers build advanced web applications quicker, smarter, and collaboratively with its Corvid development platform. On stage at Next, Wix demonstrated how developers can use Corvid and Dialogflow to build a chatbot in as little as 60 seconds.

Partnerships
109. Partners such as Cisco, Dell EMC, HPE, and Lenovo have committed to delivering Anthos on their own hyperconverged infrastructure for their customers. By validating Anthos on their solution stacks, our mutual customers can choose hardware based on their storage, memory, and performance needs.

110. Intel announced it will publish a production design for developers, OEMs and system integrators to offer Intel Verified hardware and channel marketing programs to accelerate Anthos deployment for enterprise customers.

111. VMware and Google Cloud announced a collaboration for SD-WAN and Service Mesh integrations with support for Anthos.

112. Our strategic open-source partnerships with Confluent, MongoDB, Elastic, Neo4j, Redis Labs, InfluxData, and DataStax tightly integrate their open source-centric technologies into GCP, providing a seamless user experience across management, billing and support.

113. Accenture announced an expanded strategic collaboration with new enterprise solutions in customer experience transformation.

114. Deloitte announced transformative solutions for the healthcare, finance, and retail sectors.

115. Atos and CloudBees announced a partnership to provide customers with a complete DevOps solution running on GCP.

116. Salesforce is bringing Contact Center AI to its Salesforce Service Cloud and Dialogflow Enterprise Edition to the Salesforce Einstein Platform.

117. A new integration with G Suite and Dropbox lets you create, save and share G Suite files—like Google Docs, Sheets and Slides—right from Dropbox.

118. DocuSign introduced three new innovations to expand its integration with G Suite.

119. We made a number of partner announcements around AI and machine learning, including Avaya, Genesys, Mitel, NVIDIA, Taulia, and UiPath.

120. We announced that 21 of our partners achieved specializations in our three newest specialization areas—with many more to come.

121. Our list of qualified MSPs is growing, and we introduced an MSP Initiative badge for qualified partners at Next ‘19, making it easier for our joint customers to discover partners who can help them to accelerate their Google Cloud journey.

122. We were thrilled to announce our 2018 partner award winners. You can find the full list here.

Add to this list our 123rd announcement: Google Cloud Next ‘20 will take place April 6-8, 2020, back at Moscone in San Francisco. We hope to see you there!

Universal Pattern Explains Why Materials Conduct

Mathematicians have found that materials conduct electricity when electrons follow a universal mathematical pattern.

"In a wire, electrons rebound off each other in such a complicated fashion that there’s no way to follow exactly what’s happening.

But over the last 50 years, mathematicians and physicists have begun to grasp that this blizzard of movement settles into elegant statistical patterns. Electron movement takes one statistical shape in a conductor and a different statistical shape in an insulator.

That, at least, has been the hunch. Over the last half-century mathematicians have been searching for mathematical models that bear it out. They’ve been trying to prove that this beautiful statistical picture really does hold absolutely.

And in a paper posted online last summer, a trio of mathematicians have come the closest yet to doing so. In that work, Paul Bourgade of New York University, Horng-Tzer Yau of Harvard University, and Jun Yin of the University of California, Los Angeles, prove the existence of a mathematical signature called “universality” that certifies that a material conducts electricity.

“What they show, which I think is a breakthrough mathematically … is that first you have conduction, and second [you have] universality,” said Tom Spencer, a mathematician at the Institute for Advanced Study in Princeton, New Jersey.

The paper is the latest validation of a grand vision for quantum physics set forward in the 1960s by the famed physicist Eugene Wigner. Wigner understood that quantum interactions are too complicated to be described exactly, but he hoped the essence of those interactions would emerge in broad statistical strokes.

This new work establishes that, to an extent that might have surprised even Wigner, his hope was well-founded.

Universally Strange
Even seemingly isolated and unrelated events can fall into a predictable statistical pattern. Take the act of murder, for example. The stew of circumstances and emotions that combine to lead one person to kill another is unique to each crime. And yet someone observing crime statistics in the heat of an urban summer can predict with a high degree of accuracy when the next body will fall.

There are many different types of statistical patterns that independent events can follow. The most famous statistical pattern of all is the normal distribution, which takes the shape of a bell curve and describes the statistical distribution of a wide range of uncorrelated events (like heights in a population or scores on the SAT). There’s also Zipf’s law, which describes the relative sizes of the largest numbers in a data set, and Benford’s law, which characterizes the distribution of first digits in the numbers in a data set.

..."

Extract from an article written by Kevin Hartnett, Quanta Magazine, May 6, 2019.




Mikes Notes


  • Normal distribution
  • Zipf's law
  • Benford's law (quick numerical check below)
  • Obtain copy of paper
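
As a quick, self-contained reminder of the last of those patterns (my own toy example, not from the article), the leading digits of the powers of two follow Benford's prediction P(d) = log10(1 + 1/d) remarkably closely:

    # Toy check of Benford's law using the leading digits of 2**n.
    import math
    from collections import Counter

    leading_digits = [int(str(2 ** n)[0]) for n in range(1, 1001)]
    counts = Counter(leading_digits)

    for d in range(1, 10):
        observed = counts[d] / len(leading_digits)
        expected = math.log10(1 + 1 / d)  # Benford's law
        print(f"digit {d}: observed {observed:.3f}, predicted {expected:.3f}")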

Google Ranking Factors

According to Brian Dean of Backlinko, "Google uses over 200 ranking factors in their algorithm" for search engine results.

He gives a list organised under these categories.
  • Domain Factors
  • Page-Level Factors
  • Site-Level Factors
  • Backlink Factors
  • User Interaction
  • Special Google Algorithm Rules
  • Brand Signals
  • On-Site Webspam Factors
  • Off-Site Webspam Factors

Mikes Notes

  • Is this correct?
  • How much does this change over time?

Jeff Heaton - Data Scientist

Jeff Heaton is a data scientist and an adjunct instructor at Washington University in St. Louis, and he has an interesting and useful website, Heaton Research.

"Working primarily with the Python, R, Java/C#, and JavaScript programming languages he leverages frameworks such as TensorFlow, Scikit-Learn, and Numpy to implement deep learning, random forests, gradient boosting machines, support vector machines, T-SNE, and generalized linear models (GLM)."

His website contains examples of nature-inspired algorithms.


His articles, blog posts, and videos on code make complex subjects very easy to understand.

Mikes Notes

  • A gifted teacher

Enterprise Integration Patterns from Gregor Hohpe

Enterprise Integration Patterns is an excellent book by Gregor Hohpe and Bobby Woolf. It describes 65 patterns for enterprise application integration and message-oriented middleware, in the form of a pattern language.

Published by Addison-Wesley, 2004.

Gregor Hohpe is currently a technical director in Google Cloud's Office of the CTO.

More at Wikipedia.

There is an excellent website for the book.

A 2016 interview with the authors, “A Decade of Enterprise Integration Patterns: A Conversation with the Authors”, appeared in IEEE Software; a PDF is available.

Presentation by Gregor at YOW Singapore Conference 2017


Integration styles and types

The book distinguishes four top-level alternatives for integration:

  • File Transfer
  • Shared Database
  • Remote Procedure Invocation
  • Messaging
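
To make the last of these concrete, here is a toy, framework-free Python sketch of the Messaging style: two "applications" running as threads exchange data asynchronously through a message channel, with an in-process queue standing in for real messaging middleware. The system names and message fields are invented for illustration.

    # Toy illustration of the Messaging integration style.
    import queue
    import threading

    channel = queue.Queue()  # the Message Channel

    def order_system():
        # Producer: publishes messages describing new orders.
        for order_id in range(3):
            channel.put({"type": "NewOrder", "order_id": order_id})
        channel.put(None)  # sentinel: no more messages

    def billing_system():
        # Consumer: processes messages as they arrive, decoupled from the producer.
        while True:
            message = channel.get()
            if message is None:
                break
            print("billing received:", message)

    producer = threading.Thread(target=order_system)
    consumer = threading.Thread(target=billing_system)
    producer.start(); consumer.start()
    producer.join(); consumer.join()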


The following integration types are introduced:

  • Information Portal
  • Data Replication
  • Shared Business Function
  • Service Oriented Architecture
  • Distributed Business Process
  • Business-to-Business Integration
  • Tightly Coupled Interaction vs. Loosely Coupled Interaction


The graphic icons used in the patterns are also freely available from different sources.
  • Visio Stencils
Some additional graphics notes here.

Blogs

Software

Enterprise Integration Patterns are implemented in many open source integration solutions.
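
To give a sense of what a single pattern looks like when stripped of any framework, here is a toy Python sketch of the book's Content-Based Router, where the routing decision is made purely by inspecting each message's content. Real integration solutions provide this as a first-class, configurable construct; the channels and message fields below are invented for illustration.

    # Toy sketch of the Content-Based Router pattern.
    widget_orders = []   # stands in for the widget inventory system's channel
    gadget_orders = []   # stands in for the gadget inventory system's channel

    def route(message):
        # The routing decision is based solely on the message content.
        if message["item_type"] == "widget":
            widget_orders.append(message)
        else:
            gadget_orders.append(message)

    for msg in [{"item_type": "widget", "qty": 2},
                {"item_type": "gadget", "qty": 5}]:
        route(msg)

    print("widget orders:", widget_orders)
    print("gadget orders:", gadget_orders)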

Commercial MOM Message Queuing Services in the cloud include

Mikes Notes