USA Democrats’ Plans To Control The Internet In 2 Ways Just Leaked, Could Wreak Serious Havoc.



Fact or Fiction (Control Narratives of all News in Society)
* Hence The Swift Return to the days of the TOWER of LONDON

{9003 arrested in the name of the king & taken by night to the tower} = “BEHEADED 9000”  3 lived


(to remain in Power for the Ruling Elite)

*{Mass Deception Government For The People “the mythical tale” believed by the masses when in reality the Truth will be “PRE-TRUMP STATUS QUO” “Government For The Rich And Powerful”}


(to Social Engineer the Moral Decay of Humanity to legitimise the Depravity of those within the Ruling Class)

Human Trafficking
Sexual Slavery

Senate Democrats are circulating their plan to take control of the internet.
The plans include mandatory location verification, mandatory identity verification, and disclosure requirements for political speech.

Here is a link to the document, as reported:

A leaked memo circulating among Senate Democrats contains a host of bonkers authoritarian proposals for regulating digital platforms, purportedly as a way to get tough on Russian bots and fake news. To save American trust in “our institutions, democracy, free press, and markets,” it suggests, we need unprecedented and undemocratic government intervention into online press and markets, including “comprehensive (GDPR-like) data protection legislation” of the sort enacted in the E.U.

Titled “Potential Policy Proposals for Regulation of Social Media and Technology Firms,” the draft policy paper—penned by Sen. Mark Warner and leaked by an unknown source to Axios—starts out by noting that Russians have long spread disinformation, including when “the Soviets tried to spread ‘fake news’ denigrating Martin Luther King” (here he fails to mention that the Americans in charge at the time did the same). But NOW IT’S DIFFERENT, because technology.

“Today’s tools seem almost built for Russian disinformation techniques,” Warner opines. And the ones to come, he assures us, will be even worse.

Here’s How Warner Is Suggesting We Deal:

Mandatory location verification. The paper suggests forcing social media platforms to authenticate and disclose the geographic origin of all user accounts or posts.

Mandatory identity verification: The paper suggests forcing social media and tech platforms to authenticate user identities and only allow “authentic” accounts (“inauthentic accounts not only pose threats to our democratic process…but undermine the integrity of digital markets”), with “failure to appropriately address inauthentic account activity” punishable as “a violation of both SEC disclosure rules and/or Section 5 of the [Federal Trade Commission] Act.”

Bot labeling: Warner’s paper suggests forcing companies to somehow label bots or be penalized (no word from Warner on how this is remotely feasible).

Define popular tech as “essential facilities.” These would be subject to all sorts of heightened rules and controls, says the paper, offering Google Maps as an example of the kinds of apps or platforms that might count. “The law would not mandate that a dominant provider offer the service for free,” writes Warner. “Rather, it would be required to offer it on reasonable and non-discriminatory terms” provided by the government.

Other proposals include more disclosure requirements for online political speech, more spending to counter supposed cybersecurity threats, more funding for the Federal Trade Commission, a requirement that companies’ algorithms can be audited by the feds (and this data shared with universities and others), and a requirement of “interoperability between dominant platforms.”


Original source:

For The First Time, a US Company Is Implanting Microchips in Its Employees


We’re always hearing how robots are going to take our jobs, but there might be a way of preventing that grim future from happening: by becoming workplace cyborgs first.

A company in Wisconsin has become the first in the US to roll out microchip implants for all its employees, and says it’s expecting over 50 of its staff members to be voluntarily ‘chipped’ next week.

The initiative, which is entirely optional for employees at snack stall supplier Three Square Market (32M), will implant radio-frequency identification (RFID) chips in staff members’ hands in between their thumb and forefinger.

Once tagged with the implant, which is about the size of a grain of rice, 32M says its employees will be able to perform a range of common office tasks with an effortless wave of their hand.

“We foresee the use of RFID technology to drive everything from making purchases in our office break room market, opening doors, use of copy machines, logging into our office computers, unlocking phones, sharing business cards, storing medical/health information, and used as payment at other RFID terminals,” said 32M CEO Todd Westby.

The chips make use of near-field communication (NFC), and are similar to ones already in use in things like contactless credit cards, mobile payment systems, and animal tag implants.

The same kind of human implants made headlines when they were extended to employees at Swedish company Epicenter earlier in the year, but this is the first time they’ve been offered in the US across an organisation as large as 32M, which has 85 employees.

According to Westby, when staff were informed of the program, they reacted with a mixture of reluctance and excitement, but ultimately more than half elected to take part.

The costs of the implant amount to US$300 per chip – which the company says it will pay on the employees’ behalf – and the rollout could well be a sign of things to come, meaning employees would no longer need to carry around keys, ID cards, or smartphones to operate or authenticate with other systems.

As for security concerns and whether people ought to be worried about their employer tracking their movements, Westby says the chips don’t include a GPS component and are secure against hacking.

“There’s really nothing to hack in it because it is encrypted just like credit cards are,” he told ABC News.

“The chances of hacking into it are almost non-existent because it’s not connected to the internet. The only way for somebody to get connectivity to it is to basically chop off your hand.”

As if to prove the safety of the technology, the CEO says his wife and children will also receive the implants next week, coinciding with a “chip party” being held at the company’s headquarters in River Falls, Wisconsin.

If employees later change their minds, they’ll be able to have the implant removed – but that might not be enough to alleviate Big Brother-style privacy concerns held in some quarters.

While the chips might not track workers’ location by GPS, they nonetheless could give employers a huge amount of data about what employees do and when – like how often they take breaks or use the bathroom, what kind of snacks they buy, and so on.

On its own, that information might seem fairly harmless, but it’s possible that handing over even that level of information to your employer could one day pose problems – not to mention how the privacy issues could swell as the technology evolves.

“Many things start off with the best of intentions but sometimes intentions turn,” chairman and founder of data protection firm CyberScout Adam Levin told ABC News.

“We’ve survived thousands of years as a species without being microchipped, is there any particular need to do it now? … Everyone has a decision to make; that is, how much privacy and security are they willing to trade for convenience?”

For their part, the leaders of the companies kickstarting this workplace transition don’t seem to see what all the fuss is about.

“People ask me, ‘Are you chipped?’ and I say, ‘Yes, why not?'” Epicenter CEO Fredric Kaijser told Associated Press back in April.

“And they all get excited about privacy issues and what that means and so forth. And for me it’s just a matter of I like to try new things and just see it as more of an enabler and what that would bring into the future.”

In the meantime, 32M’s inaugural chip party is being held next Tuesday.

Clear your schedule, would-be cyborgs.

2017 Jul 25 | By Peter Dockrill


1.0: Wisconsin retail tech company offers to microchip its staff. | Science and Technology News via ACI – Scholarly Blog Index, 2017

2.0: Cyborgs at work: employees getting implanted with microchips | Science and Technology News via ACI – Scholarly Blog Index, 2017

3.0: Employees At This Tech Company Can Now Get Microchip Implants | Doha Madani, The Huffington Post via ACI – Scholarly Blog Index, 2017

4.0: Human Chipping: Fishing for Uses | Craig Klugman, ACI – Scholarly Blog Index, 2015

Expand permanent human presence beyond low-Earth orbit. Go back to the Moon in 2021 and Mars by 2033


President Trump just signed a law that maps out NASA’s long-term future — but a critical element is missing

• President Trump has signed the NASA Transition Authorization Act of 2017.

• The new law calls for giving NASA a $US19.5 billion budget and asks the agency to reach Mars by 2033.

• However, the law leaves out earth science, which the Trump administration intends to cut heavily.

S.442 is LAW: March 21, 2017

For the first time in nearly seven years, the US government has passed a new long-term vision for NASA’s future.

President Donald Trump signed the NASA Transition Authorization Act of 2017, a bill also known as S.442, into law in the Oval Office on March 21.

The Senate and House had collaborated on the document for months, and it requests a $US19.5 billion-a-year budget for the space agency. (NASA received $US19.3 billion in 2016, or 0.5% of the total federal budget.)

In an image that Trump tweeted on Tuesday, the president said he’s “delighted to sign this bill reaffirming our national commitment to the core mission of NASA: human space exploration, space science, and technology.”

End of the atmospheric era?

However, that core mission is missing something that has been a part of the space agency for more than 58 years: earth science.

The 1958 document that formed NASA called upon the new space agency to contribute to the “expansion of human knowledge of phenomena in the atmosphere” — a mission that NASA, as Business Insider’s Rafi Letzter has reported, “carried out … with gusto under six Republican administrations and five Democratic ones.”

The new law doesn’t even mention earth science, which is troublesome considering what Trump’s administration has already laid out in its proposed budget for NASA released last week.

The budget would cut several major space agency initiatives, including the Office of Education, and seeks to terminate the:

Plankton, Aerosol, Cloud, ocean Ecosystem (PACE)

Orbiting Carbon Observatory-3 (OCO-3)

Deep Space Climate Observatory (DSCOVR)

Climate Absolute Radiance and Refractivity Observatory Pathfinder missions (CLARREO Pathfinder)

These four satellites allow scientists to monitor and predict the behaviour of Earth’s weather, shifting climates, ocean ecosystems, and other vital aspects of our planet. They help save people’s lives, protect wildlife, and prepare America and other nations for long-term changes.

However, these things may or may not come to pass.

While S.442 is now a law, a long and complex budgeting process remains before NASA knows what its actual funding levels are for fiscal year 2018, which runs from October 1, 2017 through September 30, 2018. Trump’s proposed budget says NASA should receive $US19.1 billion per year, or $US400 million less than Congress’ law calls for.

What the new law says

The law asks NASA to create a plan for getting humans

“near or on the surface of Mars in the 2030s.”

It also calls on the space agency to continue developing the Space Launch System (SLS) — a behemoth rocket — and the Orion space capsule in order to eventually go to the moon, Mars, and beyond.

Trump has expressed support for a crewed exploration of Mars, and in his inauguration speech he said he’s “ready to unlock the mysteries of space.” Administration officials, meanwhile, have said they want NASA to return to the moon in the 2020s.

The American Astronomical Society has a convenient breakdown of the $US19.5 billion in the bill, including funding for human space exploration, space-station operations, science, and more.

Here are some notable titles, articles, and sections of the 146-page document:

 Assuring Core Capabilities For Exploration — calls for several missions: an uncrewed launch of SLS and Orion in 2018, followed by a crewed mission to the moon in 2021, and further trips to the moon and Mars after that date.

 Journey to Mars — asks NASA for a roadmap to send people to Mars by 2033; also steers the space agency away from pursuing the Asteroid Redirect Mission (a plan to capture an asteroid, tow it into orbit around Earth, and have astronauts explore the space rock).

 Human Space Flight And Exploration Goals And Objectives — says it’s the mission of NASA to “to expand permanent human presence beyond low-Earth orbit.”

 Aeronautics — calls on NASA to be a leader in aviation and hypersonic aircraft research; also asks the space agency to look into supersonic-aircraft research that would “open new global markets and enable new transportation capabilities.”

 Mars 2020 rover — Congress backs up NASA’s plan to use the car-sized rover to “help determine whether life previously existed on that planet.”

 Europa — approves of NASA’s plan to send an orbiting satellite to Jupiter’s ice-covered moon Europa, which may have a warm subsurface ocean (and possibly host alien life).

 Congressional declaration of policy and purpose — amends previous laws to make it part of NASA’s mission to “search for life’s origin, evolution, distribution, and future in the universe.”

 Extrasolar planet exploration strategy — asks NASA to explain how it will use the James Webb Space Telescope and other instruments to hunt for exoplanets.

 Near-Earth objects — asks NASA to accelerate its program to find killer asteroids in space.

 Radioisotope power systems — implores NASA to deliver a report on how it plans to make plutonium-238 — an exceedingly rare nuclear fuel for deep-space robots — and detail what its nuclear-powered exploration plans are.

Date-stamped: 2017, March 22. | By: Dave Mosher | Source: | Article Title: President Trump just signed a law that maps out NASA's long-term future — but a critical element is missing

Sheriff Joe Arpaio Releases New Information on President Obama’s Birth Certificate

Fake Documents feature

They followed the evidence; it determined the direction of the investigation.

The Document is a Fraudulent Document.

Reed Hayes: Forensic Handwriting Examination

ForLab Multimedia Forensics Laboratory

Clearing the document was the prime scope. A court-recognised expert with 40 years’ experience examined it, and he said there’s something wrong.



Comet Siding Spring P/2004 V3 (Planet X) or Ninth Planet?

Planet IX (9) feature

Designation: P/2004 V3
Name: Siding Spring
Eccentricity: 0.4467
Semi-major axis: 7.117 AU
Orbital period: 18.99 years
Inclination: 50.4529°
Perihelion distance: 3.93839 AU
Absolute magnitude: 15.8
Last perihelion: 2004/11/11
Next perihelion (unobserved): 2023/11/07


Name: Siding Spring
Designation: P/2004 V3
Object Type: Comet

Current Position:

Constellation: Leo
Right Ascension: 11h46m
Declination: +14°46′

Orbital Elements:

Semi-major axis: 7.15 AU
Eccentricity: 0.444687
Inclination: 50.36°
Longitude of ascending node: 356.10°
Argument of perihelion: 322.31°
Epoch of elements: 16 November 2016
Mean anomaly at epoch: 227.61°


Derived quantities:

Perihelion: 3.97 AU
Aphelion: 10.33 AU
Orbital period: 19.11 years
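The derived quantities follow directly from the orbital elements. A minimal check, assuming the simplified form of Kepler's third law (semi-major axis in AU, period in years):

```python
# Orbital elements for P/2004 V3 (Siding Spring), taken from the elements listed above
a = 7.15        # semi-major axis, AU
e = 0.444687    # eccentricity

perihelion = a * (1 - e)   # closest point to the Sun
aphelion = a * (1 + e)     # farthest point from the Sun
period = a ** 1.5          # Kepler's third law: P^2 = a^3 (P in years, a in AU)

print(round(perihelion, 2), round(aphelion, 2), round(period, 2))
# → 3.97 10.33 19.12
```

The period comes out a hundredth of a year high here only because the semi-major axis is rounded to three figures.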



Galileo got it right. Falling Objects and No Air Resistance

FYI For Your Interest (01) feature

Galileo, in the late 16th century, discovered that any objects falling to Earth fall at the same rate. He argued that a cannonball and a feather, if dropped from the same height, will touch the ground at the same time provided there is no air resistance. He had difficulty demonstrating it for quite a long time.

Four centuries later, with current technology, it has been experimentally demonstrated.
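The result follows from the equation of motion h = ½gt²: solving for the fall time leaves no mass term at all. A minimal sketch (the 10 m drop height is an arbitrary illustrative choice):

```python
import math

def fall_time(h, g=9.81):
    """Time to fall height h (metres) with no air resistance.

    From h = 0.5 * g * t**2, so t = sqrt(2h / g); the mass never appears.
    """
    return math.sqrt(2 * h / g)

# Cannonball or feather, the answer is identical for a 10 m drop
print(round(fall_time(10.0), 2))  # → 1.43 seconds
```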

It’s a great visual treat to watch.


Humans 2.0: How the robot revolution is going to change how we see, feel, and talk.

FYI For Your Interest (01) feature
THE year is 2025.

You’re sitting in a surgery watching your doctor carefully insert her fingertips into black thimble-like actuators.

The gelatinous mass you feel coming to life inside you found its way into your body 24 hours earlier, when you swallowed a pill that looked unremarkable, save for its bulk.

That pill was actually a package of edible electronics, a miniature robot that will allow the doctor to feel inside your body without making a single incision.

This is the coming world of augmented humans, where technology gifts people senses, skills, and strengths never before available.

The swallowable robot is only one scenario that researchers in Bristol in the west of England are working to make a reality, as part of research that seeks to use bots to enhance, rather than replace, people.

Other projects include work to allow surgeons to operate on people located miles away with superhuman precision, and managers to split their day between offices situated on opposite sides of the world.

The conversation about robots today so often revolves around fears of how they will replace us, rather than help us.

“Just as humans like you and I are not able to do everything and don’t know about everything, robots will always have limitations.” – MANUELA VELOSO, CARNEGIE MELLON UNIVERSITY

Yet as the research taking place at Bristol shows, robotics is “more about augmenting people than it is about making them obsolete,” says Professor Anthony Pipe, deputy director of Bristol Robotics Laboratory.

He sees this research as reflecting a future where robots and humans enjoy a more symbiotic relationship—where robots work alongside people, enhancing their capabilities.

“There are lots of areas where robots could help humans do things,” said Pipe. “That’s really one of the big new areas. So as opposed to replacing humans, helping humans will be a large area for growth.”

Pipe talks about “human-robot teams” working together. “We’re not saying the robot suddenly becomes a simulacrum of a human being—it’s still a robot doing the dumb things and being instructed by a human being—but it may be able to do more useful and skillful things than robots have been used to do so far.”

He is not alone in his assessment that robots will routinely collaborate with people. In the US, professor Manuela Veloso of Carnegie Mellon University has built CoBots, wheeled bots that automatically escort people through the university building but ask people for help when needed—for instance, to call the elevator for them.

Just as bots can help people, so they will likely always need humans, Veloso said—whether it’s an automated car that needs a person to take the wheel during snowy weather or a robotic warehouse picker that can’t get a grip on a slippery object.

“Just as humans like you and I are not able to do everything and don’t know about everything, robots will always have limitations,” said Veloso. “The thing would be to continue developing algorithms in which the robots themselves are useful but capable of asking for help.”

The swallowable robot—called the MuBot—has been the focus of researcher Ben Winstone’s work at Bristol Robotics Lab in the west of England.

In effect, the device would transplant the tips of the doctor’s fingers onto its exterior, so when the robot pushes against the inside of the intestinal tract, the doctor would feel the sensation as if his or her own fingers were pressing the flesh. Using this device, doctors of the future could feel for the telltale outline of tumors and other cancerous growths in patients.

“Medical practitioners have spent years developing a highly enhanced sense of touch to allow them to carefully palpate tissue and recognise suspect lumps and bumps,” said Winstone.

“If you could take their hands and put it inside the body without opening the body up, then they can start to feel around and have an idea what’s going on,” he said.

Allowing clinicians to feel at a distance required Winstone and his collaborators to build an electromechanical fingertip on the outside of the robot. Inside the bot are an array of pins that replicate the biological features found on the internal surface of human skin. It is these pins that stimulate the receptors responsible for letting our fingertips feel. When we detect the shapes of objects, we use the Meissner’s Corpuscle, a mechanoreceptor that sits close to the surface of the skin and measures how it deforms when pressed. Similarly, when we detect how rough a surface is we rely on the Pacinian corpuscle, which acts a bit like a microphone in sensing the vibrations upon touch.

A cross section of the robotic pill. Image: Ben Winstone, Bristol Robotics Laboratory

When the soft-skinned bot presses against the intestinal wall, these pins are pushed inwards and vibrate in much the same way receptors do inside our fingertips.

Attaching sensors to each of these pins would require electronics that were too complex, power-hungry and delicate. So instead the bot relies on a camera that captures the pins’ stirrings and relays the footage to a computer that calculates what touching that gut wall would feel like based on the movement of the pins.
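The pipeline described above (a camera watches the pins, and software turns their movement into a touch estimate) can be sketched roughly as follows. The function name, pixel units, and the simple linear displacement-to-pressure model are illustrative assumptions, not details of the actual MuBot software:

```python
def touch_map(pin_displacements_px, max_travel_px=20.0):
    """Normalise per-pin displacements (pixels, measured from camera footage)
    into relative pressure values in the range 0..1."""
    return [min(d / max_travel_px, 1.0) for d in pin_displacements_px]

# A lump pressing on the middle pins shows up as higher pressure there
print(touch_map([0, 5, 18, 6, 0]))  # → [0.0, 0.25, 0.9, 0.3, 0.0]
```

A real system would track many pins in two dimensions and calibrate the displacement-to-pressure curve, but the principle is the same: the camera replaces a dense array of individual electronic sensors.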

The bot isn’t static but remote-controlled. Using a live feed from the bot’s video camera, the clinician can guide the tiny craft through the patient’s gut, pressing up against areas of interest. As a way of moving the robot, Winstone is drawing on biology for inspiration and examining how worms propel themselves forward by flexing the muscles along the length of their body, something called peristaltic motion.

“We’re looking at using peristaltic locomotion because it complements a soft bodied robot that can comply with the twists, turns and contractions of the gut as it is moving along and it doesn’t obstruct,” he said.

The sensation of touch then needs to be transferred to the doctor. For this task, a wearable haptic ‘fingertip’ is used, again lined with pins. Using the data harvested from the bot a computer arranges these pins into a 3D model of the intestinal wall. In this way, the doctor can feel what it would be like to be exploring the inside of the intestine with his fingers. Another benefit is that once the shape of the intestinal wall has been captured, the doctor or a colleague can rerun the recording and probe the intestine as many times as needed.

“There’s no way to look around deep inside the body without opening people up, so it’s a really interesting and exciting opportunity to see what can be done,” said Winstone. “You could save money on medical procedures if you discover you don’t need to take it further because you know the situation is safe, or you could help people sooner and more effectively by identifying something more quickly.”

Winstone also believes that people would be more inclined to get symptoms checked if resolving them meant “swallowing a slightly large pill” instead of invasive surgery.

Of course, that “slightly large pill” is in reality a robot and the thought of stuffing a machine down your gullet is understandably alarming.

The difference is that this pill is what is referred to as a soft robot, said Winstone.

“There’s often the idea that robots are hard, tin men. There’s a whole field of robotics made out of metal but a new approach to robotics is being realised that uses smart soft materials for both sensing and actuating,” he said. “It is more natural and more suitable for interacting with living beings, so it is much safer.”

In the case of the robot pill, Winstone has been experimenting with encapsulating the camera and pin-based tactile sensor in gel surrounded by a rubber exterior.

Then there’s the thorny issue of how to power the bot. Winstone had the problem of the bot either needing batteries that were too bulky to swallow or too feeble to last the required time. As a solution, he is looking into wirelessly transmitting the power to the bot through the patient using magnetic resonance induction.

“I’m looking at magnetic resonance induction to power the robots inside the body. That means you’re not dependent on batteries and you have the opportunity to charge. Essentially it means you have power for as long as you need it.”

Winstone estimates that he’ll have a “relatively good proof of concept” by next year and that, if everything goes to plan, there could be a system that patients could use within about 10 years.

Your life as a bot

Science fiction is full of stories where people live vicariously, sitting in virtual reality pods from where they control robotic avatars that can perform seemingly impossible tasks safe in the knowledge that any damage—or even death—is virtual.

The system that Dr. Paul Bremner is making may be a long way from letting us live these second lives, but perhaps is a first step. At the lab in Bristol, Bremner is making a rig that allows someone to control a robot in a different room and maybe eventually, from a different city or even country.

It’s a system that works today. Visit Bremner at the lab and you can strap an Oculus Rift virtual reality headset to your face and look through the eyes of a robot.

The robot in question is an Aldebaran Robotics Nao, a not-especially-imposing android standing at just under two feet tall with a permanently surprised look on its face. While this robotic avatar may view the world from the perspective of a toddler, the system still offers an out-of-body experience. Turn your head and so does the Nao, lift your arm and—thanks to tracking by a Microsoft Kinect—so does the Nao.
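A minimal sketch of that head-tracking loop: the headset's yaw and pitch are clamped to the robot's joint limits before being sent as joint targets. The limit values and function names here are illustrative assumptions, not the Nao's actual API:

```python
def clamp(x, lo, hi):
    """Restrict x to the closed interval [lo, hi]."""
    return max(lo, min(hi, x))

def head_to_robot(yaw_deg, pitch_deg, yaw_limit=119.5, pitch_limit=29.5):
    """Map tracked human head orientation onto robot head-joint targets."""
    return (clamp(yaw_deg, -yaw_limit, yaw_limit),
            clamp(pitch_deg, -pitch_limit, pitch_limit))

# Turning your head further than the robot can manage just pins its joint at the limit
print(head_to_robot(150.0, -10.0))  # → (119.5, -10.0)
```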

“If you’re interacting with someone who is themselves an extrovert, when you do a gesture, the robot does a large gesture.” – PAUL BREMNER, BRISTOL ROBOTICS LABORATORY

Gazing at the world through the bot’s eyes—actually two stereo 720p cameras—is at once peculiar and engaging, particularly turning your head to see yourself standing next to you.

One application Bremner can eventually see for the remote robotics technology is giving managers the ability to drop into offices situated hundreds or even thousands of miles apart—all without leaving their houses.

“That’s nominally one of the things that you want to be doing with it. Rather than having a Skype conversation, you have the conversation with the robot as your avatar,” said Bremner.

Of course, taking the boss seriously when he’s made of white plastic and only comes up to your knee would be tricky, but Nao’s limited stature shouldn’t be a problem. Bremner should be able to take the system he develops for Nao and transfer it to a bigger, more relatable bot.

There’s a long way to go in getting a bot to capture the subtleties of body language—the narrowing of the eyes, the pursing of the lips, the opening of the palms. In contrast Nao can open and close its hands and wears a single expression of open-mouthed wonder.

That’s why Bremner is looking at other robots such as the 3D-printed Poppy, who is twice the height of Nao, as well as the more expressive Robothespian, whose facial features can be modified using backprojection. For more expressive gestures, Bremner is considering fitting a bot with custom hands that can make a greater range of shapes.

While robots with their stiff joints and fixed faces may lack the expressiveness of a human, their ability to gaze at someone and reproduce limited arm gestures is a step up from telepresence robots today, said Bremner.

“The issue, I think, with those systems is they’re basically just Skype on wheels,” he said.

Lots of the subtle cues of face-to-face interaction are lost as a result, Bremner added. You don’t know exactly who the person is focusing on and it’s harder to keep people’s attention when you’re a screen on a pole.

Bremner’s robots could also replace some of the feedback lost by not speaking to someone face-to-face, through superimposing messages on your vision to tell you how it gauges the conversation is going.

“We could overlay that on your vision so you can have a better idea of how the interaction is going and what changes you need to make,” he said.

The bot could even go beyond reproducing your arm and head movements to exaggerating the bot’s motions to help you get on with the person you’re talking to.

“We want to be able to add some semi-autonomy to the robot control,” said Bremner, “so that if you’re interacting with someone who is themselves an extrovert, when you do a gesture, the robot does a large gesture.”

Similarly, Bremner’s partners at Queen Mary University of London are studying how to tailor the robot’s gestures to suit the mood suggested by the speaker’s voice or to stress a particular point.

Bremner is particularly interested in how people’s reaction to someone changes when they are embodied by a robot, and whether people would still respect and listen to their boss as a bot.

“What effect does that have on people’s personality perception, group interaction, and that kind of stuff,” he said.

At present the system is far away from any practical use. Bremner is currently using it to study how people react when interacting with someone’s robotic avatar, thus laying the groundwork for future interactions between remote-controlled and autonomous robots and humans.

A controllable robot from Bristol Robotics Laboratory. Image: Nick Heath/TechRepublic

Bremner said, “The idea is to gather a lot of information on how people behave when they are controlling the robot and how the interaction is successful, so we can build up this domain of knowledge for an expert system.”

A major technical hurdle facing Bremner is how to remove the wires between the bot and the PC that currently processes some of the images. Once that is achieved, the robot will be able to be mobile, rather than stationary.

Once cracked, Bremner wants the robot to roll before it can walk and expects the first movable bot will be wheeled, with people controlling it using a Segway-like motion where they lean in the direction they want it to travel.

The remote robots will get their first real-world test soon, with Bremner planning to see how people react to the bot in team-building exercises, such as desert island survival tasks.

Getting participants shouldn’t be a problem. When Bremner has shown it off the reaction has been one of excitement.

“Most people are really like, ‘Wow, this is something really exceptional’,” he said.

Robotic surgery

The versatility of the human hand is thought to have played a role in our rise to become the dominant species on Earth.

But hands have their limitations, particularly when attempting to carry out precision work such as laparoscopic surgery, where doctors operate using a few small incisions rather than a large open one.

This minimally invasive surgery causes less blood loss and residual pain in patients and means that procedures that used to require patients to stay weeks in hospital can be recovered from far more quickly. However, such work not only requires a steady hand but the use of multiple tools and assistants.

Today robotic systems such as the da Vinci Surgical System give surgeons the ability to carry out such operations with improved precision and less bleeding. They do so by allowing the surgeon to remotely control robot hands capable of far more exact movements than human ones.

Although such systems are now becoming commonplace when carrying out the delicate task of removing a prostate, for example, there is room for improvement in certain areas.

One such area is training. It typically takes surgeons about 2,000 hours of practice to become proficient with da Vinci robots, according to doctoral researcher Antonia Tzemanaki, hours that can take a doctor between one and five years to accumulate.

exoskeleton hand

The exoskeleton worn by surgeons to control the robotic hand. Image: Antonia Tzemanaki, Bristol Robotics Laboratory

To help such systems become more dextrous and intuitive to use, Tzemanaki and her colleagues at Bristol are developing a robotic system that, when compared to the “pliers” and “scissors” of the da Vinci machine, more closely mimics the movements of a human hand.

The team is building what looks like a robot claw with three digits. Each digit can hold one of 13 specialist instruments for different operations.

The surgeon puts their hand inside an exoskeleton with magnetic sensors that measure the hand’s position. The exoskeleton then relays the hand’s movements to the robotic hand and maps the movement of the surgeon’s fingers to each of the robotic digits with their attached instruments. If the surgeon moves their thumb, index finger, or middle finger then those movements will be reproduced by the robotic hand’s thumb, index or middle finger.
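The one-to-one mapping described above can be sketched roughly as follows. This is an illustrative sketch only; the function name, digit labels, and scale factor are assumptions, not the Bristol team's actual interface:

```python
# Illustrative sketch of mapping tracked surgeon-finger movements
# onto the matching robotic digits, with miniaturisation for
# precision. MOTION_SCALE is an assumed, made-up value.
MOTION_SCALE = 0.2  # shrink hand movements for finer control

def map_finger_motion(finger: str, delta_mm: tuple) -> dict:
    """Scale a tracked finger displacement into a robotic-digit command."""
    tracked = {"thumb", "index", "middle"}
    if finger not in tracked:
        raise ValueError(f"untracked finger: {finger}")
    scaled = tuple(d * MOTION_SCALE for d in delta_mm)
    return {"digit": finger, "move_mm": scaled}

# A 5 mm index-finger movement becomes a 1 mm instrument movement.
cmd = map_finger_motion("index", (5.0, 0.0, -2.0))
print(cmd)
```

The key property the text describes is that each tracked finger drives only its counterpart digit; the scaling stage is where tremor filtering and miniaturisation would sit.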

By closely mimicking the movement of the doctor’s hands, the team believes the instrument will be more straightforward to learn how to use.

For example, in an operation to remove a gall bladder, a surgeon would generally require an assistant to hold the gall bladder out of the way while the surgeon cuts tissue.

With the robotic hand, the surgeon can instead use an instrument attached to one of the digits to hold the gall bladder up, while using instruments attached to another robotic hand to apply traction and cut.

“The instrument shrinks down the surgeon’s hand and lets them operate with a superhuman level of precision,” said Tzemanaki. “Their movements will be miniaturised and filtered to make them more accurate. So there’s no tremor, there’s more precision and of course at the end of these digits there can be different instruments.”

Each robot hand has three instruments, mimicking a partial human hand. The first two instruments—the forefinger and thumb—act as grippers. The remaining “middle finger” can house any number of tools, with a blade, hook, irrigation device, and coagulation device among the many options.

Together, the two robot hands give the surgeon up to six instruments to use simultaneously and allow the doctor to mix and match the instruments they need. For example, the index and thumb could act as needle holders on one hand and as forceps on the other, with each hand letting a single person operate three different instruments at the same time.

“Whatever the surgeon is doing is reproduced in the instrument but better,” said Tzemanaki.

By mirroring the pinches and grips a human hand is capable of, the system improves on the dexterity and usability of current state-of-the-art robots for carrying out laparoscopic operations, such as the da Vinci Surgical System, she said.

If such robotic systems become commonplace, then there will also be an opportunity to overlay information from medical scans onto the video feed showing the patient, giving the surgeon more information to aid them in carrying out the operation.

Looking further into the future, such a system could benefit from the research Winstone is doing to allow doctors to remotely experience a sense of touch.

“In my PhD we’re talking about kinematics, the designing of intelligent instruments,” said Tzemanaki. “But then the absolute next and crucial step is to give a sense of touch. [The point is to make it] as natural and as close as possible to the movement of the human hand.”

As with all medical projects, the raft of regulatory approvals needed before the technology could be released means it is likely many years from being made available, perhaps as many as 20, said Tzemanaki.

robotic hand for surgical procedure

The robotic hand that holds the surgical instruments. Antonia Tzemanaki, Bristol Robotics Laboratory

Better together

The way robotic technologies can and will augment human abilities is sometimes lost amid concerns people will be unable to compete in a world of smart machines.

And while the impact of fast-approaching automation, drones, and robots on industries such as haulage, delivery, and retail is yet to be felt, the projects at Bristol demonstrate ways that people and robots can achieve more by working together rather than in competition.

Professor Erik Brynjolfsson, the MIT economist who warns societies to prepare for the upheavals automation will trigger in the job market, calls this symbiotic relationship between humans and computers “racing with machines.”

A powerful demonstration of the concept could soon be realised by the robotic exoskeletons being made by the likes of Ekso Bionics and ReWalk. While two-legged robots currently struggle to stay upright by themselves, a human in a robotic exoskeleton promises to combine the strength of a machine with the balance of a person and may one day allow the injured and infirm to walk with ease and help construction workers and soldiers carry back-breaking loads.

In the immediate future, bots are likely to continue to suffer from significant limitations. Today, robots have difficulty with manual tasks that we find simple, such as picking items from cluttered warehouse shelves, and roboticists predict their creations will find such tasks difficult for years to come. But use a robotic system to enhance a person’s capabilities and let the human fill in the gaps in the bot’s skills, and the result could be something far greater than the sum of its parts.

“As we make technological progress, people will find jobs that before were inconceivable.” DR. PETER LEDOCHOWITSCH, ALLEN INSTITUTE FOR BRAIN SCIENCE

For Dr Peter Ledochowitsch, a scientist at the Allen Institute for Brain Science, it is a reflection of how biological and synthetic systems complement each other’s strengths, as demonstrated by a remote-controlled cyborg beetle, which can fly far longer than any similar-sized electronic device.

By combining the strongest elements of natural and man-made creations, he believes we can continue to transcend the limits we face as humans.

“I adhere to the idea that as we make technological progress, people will find jobs that before were inconceivable,” said Ledochowitsch.

It’s a notion that the father of robotics, the late Joseph Engelberger, would have agreed with. The founder of one of the world’s first robot manufacturers, he was a staunch opponent of the idea that bots would diminish human existence by taking away jobs.

To the contrary, he saw robots as freeing people from humdrum busywork and empowering them to achieve more.

Robots don’t destroy employment, he told The New York Times, they “take away subhuman jobs which we assign to people” and in doing so give them the time and the tools to be better humans.

Cover image credit: iStockphoto/video-doctor.

by Nick Heath | Source: Humans 2.0: How the robot revolution is going to change how we see, feel, and talk - TechRepublic

‘We Won’t Make Frankensteins,’ Cloning Giant Boyalife’s CEO Says

BEIJING — The head of a Chinese firm that is building the world’s biggest animal cloning factory has vowed not to use the technology on people — for now, at least.

Biotech company Boyalife Group’s $30 million facility in the coastal city of Tianjin will produce embryos of cattle, as well as of racehorses and contraband-sniffing dogs, when it becomes operational next year.

“No, we don’t do human cloning, we won’t make Frankensteins,” said Dr. Xu Xiaochun, its chief executive. “The technology we have is very advanced … [but if uncontrolled] technology can also do damage … Every technology has to have a boundary.”

As a 12-year-old, Xu became fascinated with plant cloning. Now aged 44, he is leading China’s charge to become a world leader in cloning technology.


“Our primary focus is prime quality beef,” Xu told NBC News in an exclusive interview, noting that China’s cattle industry hasn’t traditionally focused on meat production.

However, beef consumption is currently growing at double-digit rates in the country, with imports increasing due to the low quality of China’s domestic beef.

The Tianjin plant will initially produce 100,000 embryos of prime beef cattle per year. That figure is eventually expected to rise to 1 million embryos annually, which will make it the planet’s largest animal-cloning operation.

The bomb-sniffing dogs have become “a major force in anti-terrorism campaigns,” Xu said, with about 600 cloned dogs working globally.

Three cloned puppies snuggle in an incubator at a Boyalife Group facility in Tianjin, China, in 2014. BOYALIFE GROUP / AFP – Getty Images

“We will only select the really top dogs for cloning like selecting only those who could go to Harvard or Peking University,” he added.

But Xu suggested society was not yet ready to embrace reproductive human cloning.

“Technology is moving very fast … [and] social values can change,” he said. “Maybe in 100 years, in 200 years, people will think differently. [They] may think this technology is going to benefit the human race as a whole … Boyalife will move along with social values.”

Xu added: “Different people have different characters … We want to keep this diversity. We really don’t want the entire society to become one billion Einsteins.”

Dec 26 2015 | by David Lom and Eric Baculinao | Source: “‘We Won’t Make Frankensteins,’ Cloning Giant Boyalife’s CEO Says”

Li-Fi has just been tested in the real world, and it’s 100 times faster than Wi-Fi

Expect to hear a whole lot more about Li-Fi – a wireless technology that transmits high-speed data using visible light communication (VLC) – in the coming months. With scientists achieving speeds of 224 gigabits per second in the lab using Li-Fi earlier this year, the potential for this technology to change everything about the way we use the Internet is huge.

And now, scientists have taken Li-Fi out of the lab for the first time, trialling it in offices and industrial environments in Tallinn, Estonia, reporting that they can achieve data transmission at 1 gigabit per second – that’s 100 times faster than current average Wi-Fi speeds.

“We are doing a few pilot projects within different industries where we can utilise the VLC (visible light communication) technology,” Deepak Solanki, CEO of Estonian tech company, Velmenni, told IBTimes UK.

“Currently we have designed a smart lighting solution for an industrial environment where the data communication is done through light. We are also doing a pilot project with a private client where we are setting up a Li-Fi network to access the Internet in their office space.”

Li-Fi was invented by Harald Haas from the University of Edinburgh, Scotland back in 2011, when he demonstrated for the first time that by flickering the light from a single LED, he could transmit far more data than a cellular tower. Think back to that lab-based record of 224 gigabits per second – that’s 18 movies of 1.5 GB each being downloaded every single second.
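The movie figure checks out with a line of arithmetic, assuming "1.5 GB" means 1.5 × 10⁹ bytes:

```python
# Sanity-check the claim: 224 gigabits per second expressed as
# 1.5 GB movie downloads per second (assuming 1 GB = 1e9 bytes).
bits_per_second = 224e9
bytes_per_second = bits_per_second / 8           # 28 GB per second
movies_per_second = bytes_per_second / 1.5e9     # 1.5 GB per movie
print(round(movies_per_second, 1))               # prints 18.7
```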

Li-Fi relies on visible light communication (VLC), a medium that operates on visible light between 400 and 800 terahertz (THz). It works basically like an incredibly advanced form of Morse code – just as switching a torch on and off according to a certain pattern can relay a secret message, flicking an LED on and off at extreme speeds can be used to write and transmit things in binary code.

And while you might be worried about how all that flickering in an office environment would drive you crazy, don’t worry – we’re talking LEDs that can be switched on and off at speeds imperceptible to the naked eye.
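The Morse-code analogy can be made concrete with a toy on-off keying scheme, where "1" means LED on and "0" means LED off. This is a minimal sketch of the idea only, not the modulation scheme actually used by Li-Fi hardware:

```python
def encode_ook(message: str) -> list:
    """Encode ASCII text as a stream of 1s (LED on) and 0s (LED off)."""
    bits = []
    for byte in message.encode("ascii"):
        for i in range(7, -1, -1):       # most significant bit first
            bits.append((byte >> i) & 1)
    return bits

def decode_ook(bits: list) -> str:
    """Recover the text from the on/off stream, 8 bits per character."""
    data = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        data.append(byte)
    return data.decode("ascii")

print(decode_ook(encode_ook("Li-Fi")))   # prints Li-Fi
```

Real VLC links layer error correction and far denser modulation on top of this, but the core trick is the same: data rides on light-intensity changes too fast for the eye to see.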

The main benefit of Li-Fi over Wi-Fi, other than potentially much faster speeds, is security: because light cannot pass through walls, transmissions are far harder to intercept from outside a room, and, as Anthony Cuthbertson points out at IBTimes UK, this also means there’s less interference between devices.

While Cuthbertson says Li-Fi will probably not completely replace Wi-Fi in the coming decades, the two technologies could be used together to achieve more efficient and secure networks.

Our homes, offices, and industry buildings have already been fitted with infrastructure to provide Wi-Fi, and ripping all of this out to replace it with Li-Fi technology isn’t particularly feasible, so the idea is to retrofit the devices we have right now to work with Li-Fi technology.

Research teams around the world are working on just that. Li-Fi experts reported for The Conversation last month that Haas and his team have launched PureLiFi, a company that offers a plug-and-play application for secure wireless Internet access with a capacity of 11.5 megabits per second, which is comparable to first-generation Wi-Fi. And French tech company Oledcomm is in the process of installing its own Li-Fi technology in local hospitals.

If applications like these and the Velmenni trial in Estonia prove successful, we could achieve the dream outlined by Haas in his 2011 TED talk below – everyone gaining access to the Internet via LED light bulbs in their home.

“All we need to do is fit a small microchip to every potential illumination device and this would then combine two basic functionalities: illumination and wireless data transmission,” Haas said. “In the future we will not only have 14 billion light bulbs, we may have 14 billion Li-Fis deployed worldwide for a cleaner, greener, and even brighter future.”

24 NOV 2015 | BEC CREW | Source: "Li-Fi has just been tested in the real world, and it's 100 times faster than Wi-Fi"