The growing popularity of 3D printing machines and companies like Thingiverse and Shapeways has given previously unimaginable powers to makers, enabling them to create everything from cosplay accessories to replacement parts. But even though 3D printing has created a new world of customized objects, most of us are still buying clothes off the rack. Now researchers at MIT are working on software that will allow anyone to customize or design their own knitwear, even if they have never picked up a ball of yarn.
A team of researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), led by computer scientist Alexandre Kaspar, released two new papers describing the software today. One is about a system called InverseKnit that automatically creates patterns from photos of knitted items. The other one introduces new design software, called CADKnit, that allows people with no knitting or design experience to quickly customize templates, adjusting the size, final shape and decorative details (like the gloves shown below).
The final patterns can be used with a knitting machine. Such machines have been available to home knitters for years, but designing patterns for them still requires a fair amount of technical knowledge.
Gloves made using CADKnit
Both CADKnit and InverseKnit aim to make designing and making machine-knitted garments as accessible as 3D printing is now. Once the software is commercialized, Kaspar envisions “knitting as a service” for consumers who want to order customized garments. It could also let clothing designers spend less time learning how to write knitwear patterns for machines, and reduce waste in the prototyping and manufacturing process. Another target audience for the software is hand-knitters who want to try a new way of working with yarn.
“If you think about it like 3D printing, a lot of people have been using 3D printers or hacking 3D printers, so they are great potential users for our system, because they can do that with knitting,” says Kaspar.
One potential partner for CADKnit and InverseKnit is Kniterate, a company that makes a digital knitting machine for hobbyists, makerspaces and small businesses. Kaspar says he has been talking to Kniterate’s team about making knitwear customization more accessible.
CADKnit combines 2D images with CAD and photo-editing software to create customizable templates. It was tested with knitting newbies who, despite having little machine-knitting experience, were still able to create relatively complex garments, like gloves, and effects including lace motifs and color patterns.
To develop InverseKnit, researchers first created a dataset of knitting patterns with matching images that were used to train a deep neural network to generate machine knitting patterns. The team says that during InverseKnit’s testing, the system produced accurate instructions 94% of the time. There is still work to do before InverseKnit can be commercialized. For example, the machine was tested using one specific type of acrylic yarn, so it needs to be trained to work with other fibers.
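The paper itself isn't reproduced here, but the broad pipeline it describes — photos paired with known patterns, a network trained to output a per-stitch instruction grid — can be sketched roughly. What follows is a hypothetical PyTorch illustration, not the authors' actual model; the network shape, the 17-operation instruction alphabet, the 20x20 stitch grid, and the 160-pixel input size are all assumptions made for the example:

```python
# Hypothetical sketch of the image-to-pattern idea behind InverseKnit, not the
# authors' actual model: a small convolutional network maps a photo of a knitted
# swatch to a grid of per-cell stitch-operation labels.
import torch
import torch.nn as nn

NUM_OPS = 17          # assumed size of the machine-knitting instruction alphabet
GRID = 20             # assumed 20x20 stitch grid per swatch

class PatternNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),   # 160 -> 80
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 80 -> 40
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(), # 40 -> 20
        )
        # One classifier per grid cell: which knitting operation produced it.
        self.head = nn.Conv2d(128, NUM_OPS, 1)

    def forward(self, photo):                   # photo: (B, 3, 160, 160)
        return self.head(self.encoder(photo))   # logits: (B, NUM_OPS, 20, 20)

model = PatternNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in tensors for the real dataset of photos paired with known patterns.
photos = torch.rand(8, 3, 160, 160)
patterns = torch.randint(0, NUM_OPS, (8, GRID, GRID))

for step in range(100):
    logits = model(photos)
    loss = loss_fn(logits, patterns)   # per-cell classification loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```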
“3D printing took a while before people were comfortable enough to think they could do something with it,” says Kaspar. “It will be the same thing with what we do.”
Ninja, the former Twitch superstar, has left the popular streaming service and is now only on Mixer. This leaves a gap for someone else to jump into Twitch streaming and become the next Ninja! Let’s figure out who that next famous streamer might be.
Your challenge this week: Bring some fresh faces to Twitch!
Anyone from gaming, TV, movies or comics is fair game. Who will be the next megastar on Twitch? Hank Hill? Maybe. And yes, I spent too much time adding some King Of The Hill references to the image above.
To help you folks out, here are a few overlays I made and recolored. Or grab any from the internet or make your own! It’s up to you.
Next week I’ll pick a winner and hand out some awards!
Please note that the image submissions rules have changed just a little bit. We’re looking for images that are 800 pixels wide now!
How To Submit — Instructions
1. Create your ‘Shop and save it to your desktop. Images must be at least 800 pixels wide.
2. Go to the bottom of this post.
3. This brings up a comment window. Click “Choose file” if you’re uploading your ‘Shop from your desktop.
4. Alternatively, you can upload the ‘Shop to a free image hosting service. I suggest imgur. Then paste the image’s URL into the field that says “Image URL.” Note: this must be the URL of the image itself, not the page where it is displayed. That means the URL ends in .jpg, .gif, .png, whatever.
5. Add editorial commentary, hit submit, and your image will load. If it doesn’t, upload the image to imgur and paste the image URL as a comment. I will look at it.
6. Large-size images may not upload properly, though we have seen some animated .gifs upwards of 5 MB. If you’re still having trouble uploading the image, try to keep its longest dimension (horizontal or vertical) under 1000 pixels, or the whole thing under 2 MB.
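If your image comes in over those limits, a quick resize before uploading usually does the trick. Here's a minimal, optional sketch using the Pillow library (the file names are placeholders, not part of the contest rules):

```python
# Optional helper: shrink an oversized 'Shop before uploading.
# Requires Pillow (pip install Pillow); the file names below are placeholders.
from PIL import Image

img = Image.open("my_shop.png")
img.thumbnail((1000, 1000))   # keeps aspect ratio; longest side capped at 1000 px
img.save("my_shop_small.png", optimize=True)
```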
Over the past few months, NASA's Parker Solar Probe flew closer to the sun than any spacecraft before it -- not once but twice, on two separate flybys. The probe collected as much data as it could so that we can understand the sun better. Now its mission team at Johns Hopkins Applied Physics Laboratory in Maryland has just received the final transmission of the 22 gigabytes of science data collected during those two encounters. That's 50 percent more than it expected to receive by now, all thanks to the spacecraft's telecommunications system performing better than expected.
Parker's ground team found out soon after launch that the probe is capable of a higher downlink rate. In fact, they're taking advantage of that ability by instructing the probe to send back even more data from the second encounter in April. During that event, the spacecraft's four suites of science instruments kept busy collecting information. That's why the mission team is expecting to receive an additional 25GB of science data between July 24th and August 15th.
The mission team will release the data from the first two encounters to the public later this year. Before that happens, the spacecraft will conduct its third flyby, which will start on August 27th and reach closest approach on September 1st. Researchers are hoping that over the next few years the mission can gather the information we need to unravel some of the sun's biggest mysteries, including why the sun's corona (its aura of plasma) is far hotter than its visible surface.
There have been privacy concerns about digital assistants for just about as long as there have been digital assistants, and the recent confirmation that Google and Apple were listening to Assistant and Siri conversations has done nothing to allay fears.
The 'were' in that last sentence is important, as both companies have agreed -- at least temporarily -- to cease the practice. Not wanting to miss out on an opportunity for good PR, Amazon is getting in on the action, giving Alexa users the chance to opt out of having their conversations with its digital assistant listened to -- or "manually reviewed", as Amazon would prefer. Here's how to do just that.
A new setting has been implemented that gives users the opportunity to indicate that they'd really rather actual human beings didn't eavesdrop on (er... review) the things they say to Alexa. Amazon warns that opting out of manual review could mean that "voice recognition and new features may not work well for you", but it has bowed to pressure and introduced the opt-out anyway.
We take customer privacy seriously and continuously review our practices and procedures. For Alexa, we already offer customers the ability to opt-out of having their voice recordings used to help develop new Alexa features. The voice recordings from customers who use this opt-out are also excluded from our supervised learning workflows that involve manual review of an extremely small sample of Alexa requests. We'll also be updating information we provide to customers to make our practices more clear.
If you would like to opt out, you can head to amazon.com/alexaprivacysettings. You can also opt out using the Alexa app by opening Settings and going to Alexa Privacy followed by Manage How Your Data Improves Alexa. Flick the "Help Improve Amazon Services and Develop New Features" toggle to the off position.
A paper posted online this month has settled a nearly 30-year-old conjecture about the structure of the fundamental building blocks of computer circuits.