The Sewbot Robot Will Produce Millions of ‘Made in USA’ T-shirts

In 2018, the Chinese group Tianyuan Garments will open a brand-new factory in Arkansas, home to an army of autonomous Sewbot robots capable of manufacturing nearly 1.2 million t-shirts a year, directly on American soil, under the supervision of only a handful of technicians.

Last year, President Trump based much of his election campaign on his promise to bring back companies that manufacture goods at lower cost abroad (read on CNN), notably in China.

Regularly the target of the American leader’s ire (read here), the Middle Kingdom is described by many as a country that uses social dumping to produce cheap products that are then resold in the United States, such as clothing bearing his daughter’s label (read in Newsweek).

The situation is changing. But not in the direction hoped for by The Donald.

A stand-alone robot

Indeed, the Chinese Tianyuan Garments Company will open its new factory in 2018, right in the heart of US territory. In Arkansas, 21 production lines of sewing robots will be able to manufacture 100,000 t-shirts a month.

At this production rate, the Chinese-owned factory will be able to compete on cost with t-shirts made in China and then shipped by cargo to their point of sale.

The plant will be one of the first in the world to use SewBot machines, developed by SoftWear Automation, based in Atlanta. Eventually, this process could transform the landscape of the world textile industry.

The Sewbot was developed at Georgia Tech’s advanced technology research center, in a program launched there nearly a decade ago. In 2012, the researchers were finally awarded a grant by DARPA, the research agency of the US Department of Defense, to develop the process toward commercialization.

By 2015, SoftWear Automation was marketing a simpler version of its sewing robot, able to produce bath mats and towels at an incredible rate.

The latest evolution of this machine, the stand-alone robot deployed in the Little Rock, Arkansas plant, will now be able to manufacture t-shirts and partially produce jeans.

SoftWear Automation’s customer, the Chinese Tianyuan Garments Company, has already indicated that the goal is to produce the equivalent of 800,000 t-shirts a day with its fleet of machines, a figure that would be hard to believe were these robots not autonomous.

The death of textile jobs?

To supply and maintain this precision machinery, the plant should create about 400 jobs. That figure is, of course, nowhere near the number of employees needed for a more ‘traditional’ production line.

For its part, SoftWear Automation is trying to counter the idea that its sewing robots are about to decimate employment in the textile sector.

According to a study carried out in-house, the manufacturer argues that a robot such as the SewBot generates between 50 and 100 jobs across its value chain, in particular because it makes it possible to earn the ‘Made in USA’ label and gives the clothing brand the opportunity to buy local raw materials, increasing the demand for labor in the surrounding area.

Admittedly, that account seems a bit optimistic. But the Sewbot has other advantages: the Fashion for Good initiative, which works to make the textile sector aware of environmental issues, estimates that the Sewbot could help reduce the sector’s emissions by about 10%.

Research Shows Young Parents Prefer Robots to Care For Them in Old Age

If earlier generations of parents liked the idea of being looked after by their children as they age, the generation of ‘millennial parents’ thinks very differently.

For them, enlisting the help of artificial intelligence or even robots is preferable to depending on their offspring, the so-called ‘generation alpha’ children born from 2010 onward.

At least, that is what a study by the IEEE on the impact of AI across generations of parents and children suggests. The study, involving 600 fathers and mothers between the ages of 20 and 36, found that 63 percent of them would rather have a robot help them than rely on their offspring to live independently as seniors.

From robot-babysitters to AI teachers

This preference for robots is not limited to independent living. The study also found that 80 percent of parents believe that using AI will help their children learn from an early age, and even faster than their own generation did.

Similarly, 74% of them would consider using a robot as a teacher for their children, and 40% of respondents said they would accept using a nanny robot to care for their children.

Much of the reason for this comes down simply to the ease it would bring to everyday life. 45%, for example, agree that these technologies “minimize frustration as a parent,” while 64% think using them would give them “more time to do other things.” 63%, however, admit that this would affect the time they spend with their children.

A growing presence

Do they want to stop this trend? Not at all. 48% of parents said they would consider getting a “robot pet” for their child. It is worth noting that mothers were more apprehensive about this decision, with only 42% receptive, compared with 55% of men.

As for the fear of letting their child drive a car for the first time versus the fear of leaving them in an autonomous car, the split is close: 31% find the first option more worrisome, against only 25% for the second. The remaining 44% see both as something to fear.

Last but not least, the study also pointed out that the presence of these technologies has influenced the career paths parents hope their children will follow: 74% of them said they would encourage their children toward an engineering career.

OpenAI Designs an AI-Based Algorithm That Allows a Robot to Mimic Tasks Performed by Humans

In December 2015, Elon Musk and a number of people and companies in the technology industry joined forces to announce the creation of OpenAI, a non-profit organization whose goal is to make the results of its artificial intelligence research available worldwide without requiring financial compensation.

At the time of its creation, the founders explained that their researchers would be strongly encouraged to share their work with the world in the form of papers, blog posts, code, and patents (if any). A few years have now passed, and a few days ago the organization announced the availability of a new algorithm based on artificial intelligence.

OpenAI has announced the availability of a framework that allows robots to learn by imitating what they are shown. Generally, for a system to master the various facets of a task and execute it without problems, it must be trained on a broad range of samples. OpenAI therefore wanted to speed up learning by allowing robots to learn the way human beings do.

This gave rise to the “one-shot imitation learning” framework. With this algorithm, a human can show a robot how to perform a new task by executing it once in a virtual reality environment. From that single demonstration, the robot can then perform the same task starting from an arbitrary initial configuration.

Traditionally, one could train a policy by imitation or reinforcement learning to stack blocks into towers of three. With this new algorithm, however, the researchers have designed policies that are not specific to one particular task instance, but can instead be used by the robot to decide what to do in a new configuration of the task.
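
To make the idea concrete, here is a minimal Python sketch of the difference between a task-specific policy and a demonstration-conditioned, “one-shot” policy. The function names, dimensions, and the crude averaging of the demonstration are illustrative assumptions, not OpenAI’s published implementation.

```python
import numpy as np

# Illustrative dimensions: an observation is a flat feature vector, an action a
# small control vector. A "demonstration" is the list of (observation, action)
# pairs recorded while a human solves the task once, e.g. in VR.
OBS_DIM, ACT_DIM = 32, 7
rng = np.random.default_rng(0)


def task_specific_policy(obs, weights):
    """Classical setup: one set of weights trained per task instance."""
    return weights @ obs


def one_shot_policy(obs, demo, weights):
    """One-shot imitation: a single trained policy, reused for any new task
    instance by conditioning it on one fresh demonstration of that instance."""
    # Toy conditioning: summarize the demo and concatenate it with the current
    # observation. A real system would use neural networks for both steps.
    demo_summary = np.mean([np.concatenate([o, a]) for o, a in demo], axis=0)
    return weights @ np.concatenate([obs, demo_summary])


# Handling a new block arrangement only requires recording one new demo:
demo = [(rng.standard_normal(OBS_DIM), rng.standard_normal(ACT_DIM))
        for _ in range(20)]
w = rng.standard_normal((ACT_DIM, OBS_DIM + OBS_DIM + ACT_DIM))  # trained once
action = one_shot_policy(rng.standard_normal(OBS_DIM), demo, w)
print(action.shape)  # (7,)
```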

In a video accompanying the announcement, OpenAI demonstrates how a policy is formed that solves a different instance of the same task, using as its only learning data the observation of a single other demonstration.

To stack the blocks, the robot relies on an algorithm built on two neural networks: a vision network and an imitation network. The vision network acquires the required perception skills by training on hundreds of simulated images of the task with varying lighting, textures, and object placements. The imitation network observes a demonstration, processes it to infer the trajectory of the moving objects and the intent of the task, and then accomplishes that intent starting from blocks arranged differently.

Inside the imitation network, a mechanism called “soft attention” operates both over the steps and actions of the demonstration and over the components of the vector specifying the locations of the various blocks in the environment, picking out the appropriate blocks to use at each stage of the stacking.
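
As a rough illustration of this two-network design, the PyTorch sketch below shows a vision network that turns a (domain-randomized) simulated image into a vector of block positions, and an imitation network that attends softly over the demonstration before predicting the next action. The layer sizes, the small convolutional model, and the single attention head are assumptions chosen for brevity, not the networks OpenAI describes.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative dimensions; the real architectures and sizes are not specified here.
IMG_CH, STATE_DIM, ACT_DIM, HIDDEN = 3, 6, 7, 64


class VisionNet(nn.Module):
    """Maps a (domain-randomized) camera image to a vector of block positions.
    Trained on simulated images with varied lighting, textures, and poses."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(IMG_CH, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, STATE_DIM)  # e.g. (x, y) per block

    def forward(self, img):
        return self.head(self.conv(img).flatten(1))


class ImitationNet(nn.Module):
    """Attends over the demonstration trajectory ('soft attention') and over
    the block-location vector, then predicts the next action."""
    def __init__(self):
        super().__init__()
        self.query = nn.Linear(STATE_DIM, HIDDEN)
        self.key = nn.Linear(STATE_DIM, HIDDEN)
        self.value = nn.Linear(STATE_DIM, HIDDEN)
        self.policy = nn.Linear(HIDDEN + STATE_DIM, ACT_DIM)

    def forward(self, demo_states, current_state):
        # demo_states: (T, STATE_DIM), current_state: (STATE_DIM,)
        q = self.query(current_state)                    # (HIDDEN,)
        k = self.key(demo_states)                        # (T, HIDDEN)
        v = self.value(demo_states)                      # (T, HIDDEN)
        attn = F.softmax(k @ q / HIDDEN ** 0.5, dim=0)   # weights over demo steps
        context = attn @ v                               # (HIDDEN,)
        return self.policy(torch.cat([context, current_state]))


vision, imitation = VisionNet(), ImitationNet()
blocks_now = vision(torch.randn(1, IMG_CH, 64, 64)).squeeze(0)  # current block layout
demo = torch.randn(20, STATE_DIM)                               # one observed demonstration
print(imitation(demo, blocks_now).shape)                        # torch.Size([7])
```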

The researchers explain that, for the robot to learn a robust policy, a modest amount of noise was injected into the outputs of the scripted policy used to generate training data. This allows the robot to perform its task properly even when things go slightly wrong. Without this noise injection, the robot would not have been able to generalize what it learned from observing a specific demonstration.
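
The general idea behind this noise injection can be sketched as follows; the toy dynamics, the scripted controller, and the noise level here are invented for illustration and are not OpenAI’s data-collection code. By perturbing the scripted policy’s actions while recording its clean corrective commands, the collected trajectories show the learner how to recover from small mistakes.

```python
import numpy as np

rng = np.random.default_rng(0)


def scripted_action(state, goal):
    """Stand-in for the hand-written block-stacking script: move toward the goal."""
    return 0.5 * (goal - state)


def collect_demo(goal, steps=50, noise_std=0.05):
    """Roll out the scripted policy while perturbing its actions with Gaussian noise.
    The recorded trajectory then contains small mistakes *and* the script's
    corrections, which is what lets a policy trained on it recover from errors."""
    state, traj = np.zeros(3), []
    for _ in range(steps):
        action = scripted_action(state, goal)
        noisy_action = action + rng.normal(0.0, noise_std, size=action.shape)
        traj.append((state.copy(), action))  # record the clean, corrective command
        state = state + noisy_action         # but the world sees the noisy one
    return traj


demo = collect_demo(goal=np.array([0.2, -0.1, 0.3]))
print(len(demo), demo[-1][0])  # 50 recorded (state, action) pairs
```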

Finally, it should be noted that although the “one-shot imitation learning” algorithm was used here to teach a robot to stack colored blocks, it can also be applied to other tasks.