REAP | Research Entrepreneurs Accelerating Prosperity

Filip Jadczak here, back again to share my experiences in my newest sprint here at the Felt Lab. This time I’ve been testing out a brand new piece of technology we’ve got here in the lab: the Meta 1.

The Meta is a set of augmented reality glasses — that is, they provide a layer of digital information on top of the real world. Unlike the Oculus Rift, which focuses on virtual reality and closes you off from the world completely, the Meta remains see-through and simply adds to what you see. It is comparable to Microsoft’s HoloLens or Google Glass.

Meta first appeared on Kickstarter back in June 2013, raising almost $200K and surpassing its funding goal. Since then the company has been working on its product. You can sign up to become one of their early pioneers, or a developer, on the Meta website.

Since this piece of equipment is so new, my job was to figure out how to get it set up and working in the lab so others can try it out. We hit a few bumps in the registration process, but after that the next step was to set up the software development kit (SDK) for the Meta. I ran into another issue here because I didn’t realize that the Meta software only runs on Windows 8 and is not compatible with Windows 7 (anyone interested in the Meta, take note!). After that, however, I was able to get everything working and test out what the Meta can really do.

The Meta SDK comes with a few basic demos. First, you calibrate the glasses so they are adjusted well and can respond properly to your gestures. After that, you can run the demo which has a few basic features. The first demo generates a set of geometric objects which you can pick up with your hand and move throughout the space around you. The other demos feature a virtual car that separates into pieces, and exploding bubbles you can pop with your fingers!

Meta has two apps which can be downloaded from their website. The first creates a set of virtual objects that you can interact with by moving them around, resizing them, and so on. The second app is the first ever augmented reality web browser, which you can interact with in a similar way. The website claims that more apps are coming soon and encourages users to create and submit their own. (The Meta runs on Unity, so anyone who is familiar with that software can create apps for the Meta.)

There are a few challenges with the Meta at present. Firstly, it doesn’t always recognize when you are trying to grab one of the virtual objects — an issue I ran into several times, which can be frustrating. Secondly, there isn’t much content currently available for the Meta. I’m sure that with time both of these issues will be resolved. Once the Meta gets through its early stages, I think it will grow into a much better product.

Overall, the Meta is a pretty cool piece of technology. If you are ever in the lab on one of our open days, feel free to come and try it out! Augmented reality seems to be the way that things are going these days, so come out and get a glimpse of the future here at the Felt Lab.

Art-Mersion is an event that was created to explore new ways to experience a deeper understanding of a piece of art through technology. When we heard that Amy Ferrari was going to create a new exhibit called “Serene Yellow Spaces,” we gathered a team and started brainstorming what we could do to communicate beyond just the surface view of her paintings.

The team consisted of students, artists and experienced technology professionals. Our ideas ranged from mobile-based Augmented Reality concepts to a full-blown, interactive, Christie Digital Projection Mapping on the outside of the Button Factory Arts building. While we had our hearts set on running the interactive projection of Amy’s art onto the building, the cold March weather put that idea away for a future event.

Our mission was to design experiences that would create a deeper understanding of the art and enhance the theme of the new Serene Yellow Spaces exhibit.  We wanted to allow people to interact with the art and understand Amy’s creative process and the stories behind the paintings.

The result was an exciting exhibit that supplemented the exhibition’s opening celebration and extended it across two days (March 13th and 14th) at the Button Factory. Over 200 people attended, and in addition to meeting Amy and seeing her new exhibit, they were able to experiment with 8 different technology installations and hear Frank Chen perform his musical interpretation of the Serene Yellow Spaces exhibit. Visitors were excited and delighted by the diversity of the experiences. Everyone came away with an understanding of the evolution of the exhibit as a whole, and with a chance to learn about the creative process, from the initial sketches of the ideas to the final art piece.

Within a week of opening, almost half of the paintings were already sold. Our hypothesis is that by having a deeper understanding of the background behind a painting and the artist’s personal thoughts, people are more likely to develop an emotional bond with a work of art and appreciate it on a deeper level.

The 8 digital experiences included the following technology:

  • Oculus Rift Virtual Reality Art Gallery showing Amy’s paintings in a 3D environment
  • Aurasma Augmented Reality allowed people to view the video stories for each painting through mobile devices
  • GestureTek Cube provided hours of amusement for both young and old as they were able to interact with elements within the paintings
  • A slideshow tour of Amy’s technique and the background of the Serene Yellow Spaces exhibit
  • Interaxon’s MUSE Brain Sensor was used to allow people to see how their brain activity could control activity in “Amy’s Garden”. By seating them in front of a large 4×4 Christie MicroTile wall, they could feel immersed in a serene garden space and visually train their brain to calm down and relax
  • Content Interface Corporation’s touch screen display allowed people to explore Amy’s paintings and control which ones they wanted to learn more about
  • The Microsoft Kinect for Windows sensor was used with facial recognition software to allow people to move from painting to painting in a large-scale 2D projection of a virtual gallery installation. By moving their head, they were able to move the cursor around the large screen, and by squinting their left eye, they could “click” on the arrows to move from painting to painting
  • Derivative’s TouchDesigner was integrated with the Microsoft Kinect for Windows sensor to capture the viewer’s body movement, allowing them to digitally mush the paint in Amy’s art by dancing with a palette of selected colours and custom music compositions

You can imagine the feelings of bliss and joy that were experienced when the visitors were allowed to play with the art! Many thanks to all the volunteers, Button Factory Arts and REAP for helping to make this event possible and for making it such a great success!

At this past Friday’s lunch and learn, we were lucky to have Ryan Van Stralen from Palette come talk to us about a new hands-on control device for any software. The Palette is a hardware controller that resembles a DJ panel, which was part of the inspiration when founder Calvin Chu, a University of Waterloo graduate, first designed it. What Calvin really wanted was a hands-on device that is customizable in both function and design: each Palette module can do whatever you program it to do, and the modules can be attached together in any arrangement you find desirable.

Palette was a Kickstarter project launched by Calvin and his team with a goal of $100,000; thanks to contributors like REAP, they exceeded that goal, raising $158,000 in total. Check out the Kickstarter video here!

With condensed plugins, Palette becomes the central controller for all your software. It has a very simplified interface, moving from simple touch to a more tactile feel with its dials. After Palette completed its Kickstarter campaign, Calvin went to Adobe in 2014 and has since worked very closely with them on making the Adobe suite a major market for Palette control. So imagine you are a photographer who performs the same edits on hundreds of photos: with the Palette you can skip the mouse, and with the click of a button or the turn of a dial set to a specific function, you have polished photos. Palette almost makes you feel like a DJ for Photoshop.

But don’t worry, you don’t have to be into Adobe to reap the benefits: Palette lets you hack your own software so that each control can be paired with whatever application or game you use the most. Palette is currently working on wireless controls, so you’ll be able to use your Palette pieces without being tethered to your computer by a wire, making it that much more accessible.
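The idea of pairing each physical control with a user-defined action can be sketched in a few lines. This is purely illustrative (the function names and control IDs are hypothetical, not Palette’s actual SDK): a small registry maps a dial or button ID to whatever action you choose.

```javascript
// Hypothetical sketch of control-to-action mapping, not Palette's real API.
const bindings = {};

// Pair a physical control (by ID) with a user-defined action.
function bindControl(id, action) {
  bindings[id] = action;
}

// Dispatch an incoming control event to whichever action is bound to it.
function handleEvent(id, value) {
  const action = bindings[id];
  return action ? action(value) : null;
}

// Example: a dial adjusts exposure, a button applies a saved preset.
bindControl("dial-1", (v) => `exposure set to ${v}`);
bindControl("button-1", () => "applied preset");
```

The same registry could just as easily drive a game or any other application: only the bound actions change, not the controller.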

I think what we will see out of Palette is the next generation of hardware, where each computer comes not only with a keyboard and mouse, but with a Palette as well.

I got to use the Palette on Friday, and since I am fairly comfortable with Photoshop, I was able to transform a picture within minutes, without the constant back and forth of the mouse hunting for the right setting and always going back to the same ones. Instead, with the click of a button I was already using my favorite tool, and with the turn of a dial I had it tuned just right.

Palette is the future of shortcuts.

Every person who has ever written an essay or compiled an APA- or MLA-style bibliography knows the hassle of trying to organize and track all of your online and offline sources. So when Salman Heman joined us for a recent lunch and learn to talk about how Axiom can alleviate that hassle, we were excited to learn all about it.

Axiom is an online service that allows users to organize and annotate a wide variety of digital documents (such as Word documents, PDFs, PowerPoints, and even videos!), all while collaborating with others in real time.

Beyond just helping researchers and students, Axiom helps professors, professionals and collaborators work with their digital documents by letting them centralize their research easily. Rather than having to collect, annotate, and then sort physical documents, which can be time-consuming and difficult to manage, Axiom allows users to simply drag and drop their online documents into personalized digital bookshelves, where they can be shelved and organized with ease. Axiom also allows users to annotate digital books, notes, and videos, and even has a browser extension that lets you track external webpages for your references.

Their system also renders various documents into a web format so that users can view them directly in the browser without any plugins or extensions. Once this is done, users can cross-reference across different file types, which they cannot do with existing cloud-based platforms such as Google Drive and Dropbox. Another unique advantage of the platform is that it is publisher agnostic; most publishers offer e-textbooks that are available only on their respective online platforms, which means students have to manage and learn several different platforms and, as a result, often end up using none of them.

They also allow users to easily mark up their documents as they would a physical copy, using pen, highlighter and sticky-note options. Currently they are working on linking notes to the highlighting feature, which will be offered in a future release.

They also have a virtual bookstore through their partnership with OpenStax College, where you can purchase digital textbooks for just $4.99. Their goal is to give book and journal publishers access to an entirely new, centralized audience, while keeping the cost of textbooks low for those audiences.

Additionally, they offer access to their annotation platform free for life; students only pay if they want to build a library of more than 10 digital documents. Inviting and sharing with others is also easy: all you have to do is type in an email address and send an invitation.

For more information, visit their website. Also, visit our Tumblr account for more pictures of the event!

Inspired by the art of Amy Ferrari, Amy’s Garden is an interactive visual installation that will be on display at Art-Mersion, March 13th-14th at Waterloo’s Button Factory Community Art Centre. I created the work using off-the-shelf web and 3D technologies, with a Christie Digital display and MUSE headband on loan from REAP.

I built up a lot of expertise in WebGL and Three.js while freelancing on an ill-fated public sector project, and so had no way to showcase my work. At the time, Jennifer Janik of Deep Realities was assembling a collective of creatives for Art-Mersion, a kind of festival of virtual and interactive art that would feature an exhibition of Amy Ferrari’s paintings.

Amy’s style is all liquid, day-glow forms, and they naturally suggest shaded 3D models. I started to wonder: what would her world look like if it was in motion? What would the garden in her world look like?

The undulating plants and animals are first modelled in Blender, software more often associated with creating Pixar-style movies. Then I transformed the models into a web-based environment that grows the garden interactively. A spectator looking at the artwork can put on the MUSE headband, a consumer-grade brainwave sensor made by Toronto’s InteraXon. The viewer’s mental activity is interpreted by software, telling the garden how to grow: whether to thrive or to wilt. Amy saturates her art with optimism, so this emotional component of Amy’s Garden is important.
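The thrive-or-wilt mapping can be sketched in a few lines. This is a minimal illustration of the idea, not the installation’s actual code: the function names, the smoothing factor and the calm threshold are all assumptions, with the headband’s output reduced to a single calm score between 0 (agitated) and 1 (relaxed).

```javascript
// Hypothetical sketch: drive a garden's growth from a MUSE calm score.

// Smooth the raw calm readings so the garden doesn't flicker frame to frame.
function smoothCalm(prev, raw, alpha = 0.1) {
  return prev + alpha * (raw - prev);
}

// Above the threshold the garden thrives; below it, the rate goes
// negative and the garden wilts.
function growthRate(calm, threshold = 0.5, maxRate = 0.02) {
  return ((calm - threshold) / (1 - threshold)) * maxRate;
}

// Each frame, scale the plant models by the current growth rate,
// clamped so they neither vanish nor overgrow the scene.
function updatePlantScale(scale, rate) {
  return Math.max(0.1, Math.min(2.0, scale * (1 + rate)));
}
```

In a Three.js scene, the clamped scale would simply be assigned to each plant model’s scale each animation frame.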

The MicroTile display is as large as one of Amy’s paintings and very colourful, so it’s just as immersive. I hope people lose themselves in it as they would with a painting.

Serene Yellow Spaces by Amy Ferrari opens March 13th, 2015. And please feel free to also check out my own website here.

Art-Mersion runs March 13th-14th at:
Waterloo Community Arts Centre
25 Regina Street S., Waterloo, Ontario

I hope to see you there!

Guest post by Filip Jadczak.

My name is Filip Jadczak, and this has been my first term working with REAP. I am a student at the University of Waterloo in the Global Business and Digital Arts program, so naturally the work I do in my classes fits in well with what is going on at REAP. I originally heard about the opportunities at the Felt Lab through another student in my program, so once things started back up after the summer I got involved as soon as I could.

My experiences at REAP have been invaluable. Beyond just the work experience that is offered, there are weekly lunch and learn sessions at the Felt Lab. I have been regularly attending these sessions and through them, I’ve connected with professionals, artists, and fellow technology enthusiasts in the field. I’ve learned about interesting topics and taken part in activities as well. Some memorable ones were the rapid prototyping workshop hosted by Boltmade, the thinkering session for the MaRS Innovation Centre, and the Structur3D 3D printing session.

When I started at REAP in September, I was given my first project. At REAP projects are called ‘sprints’, as they are meant to be a quick test of a piece of technology for a certain purpose to see if it should be used in that way. My task was to install an app called Apptui onto the lab computers and devices, and customize it for the lab. Apptui allows the user to control a computer using a mobile device, such as a phone or tablet, in place of a mouse and keyboard. I found it fascinating as I had never heard of it before — but it was just the first of many remarkable pieces of technology I encountered in the Felt Lab!

After completing my first project, I filled out a report to explain what I did with the technology, what worked and what didn’t, and whether it should continue to be used in the lab. I also gave my own presentation alongside two other students at our own Felt Lab Friday session!

My second project at the lab, which I am currently still working on, is developing interactive apps for the various screens in the Felt Lab. I actually suggested this project myself, as I had noticed that while we do have lots of touch screens in the lab, people don’t always interact or play with them during the Felt Lab Fridays. I wanted to give visitors to the lab a chance to play with the technology instead of just looking at it. You can find my work on the MicroTile wall in the lab — look for an application called “Fridge Magnets!”

Overall I have greatly enjoyed working in the Felt Lab. It has given me the chance to encounter, examine and play with a variety of devices I would not normally have access to otherwise. This, along with my course work at the university, is constantly inspiring new ideas in my mind. There’s so much I want to do, and the Felt Lab is just the beginning. It has become a great launch pad into a future of possibilities for me, and I am very grateful for the opportunity to be part of the team here.

You can always find me in the lab on Fridays. I hope you come to our Lunch and Learns and become part of our community! See you there!

Reposted from The ObserverXtra. Written by Scott Barber.

Artists and tech experts from across the region will converge on St. Jacobs this spring to create an innovative series of projects designed to explore the village’s culture and heritage.

The Felt Lab, a community-based technology hub located in the Quarry building at 140 King St. in St. Jacobs, will host a series of collaborative art workshops through the spring and summer run by Isabella Stefanescu of the Inter Arts Matrix.

With a $12,000 Waterloo Arts Fund grant and a passion for local arts, Isabella Stefanescu, an artistic director with the Inter Arts Matrix, and researchers with the University of Waterloo’s Felt Lab – a tech incubator located in St. Jacobs – have teamed up to connect the art and tech worlds.

The idea: bring together talented and creative individuals from a variety of disciplines and fields, immerse them within a specific cultural context – in this case, right here in Woolwich Township – and give them the freedom to create together.

What the final product will look like is up in the air – from films to sculptures, paintings and plays, it’ll be up to the artists to determine.

What is clear is the project’s focus.

“As artists we have a responsibility to create art that has deep roots here in the region,” Stefanescu explained. “What makes us unique is the place we live. Otherwise we are just like everyone else around the world; our community defines us.”

So much of the arts scene here is imported from other regions and countries, Stefanescu added. That’s fine, but we also need locally produced art to examine who we are and to spur the discourse on what it means to live in the Waterloo Region.

And it’s a fascinating topic.

“The artists themselves will have to start thinking about the place they find themselves,” Stefanescu said. “We will be working right there in the village (of St. Jacobs, at the Felt Lab studio on King Street) and we will be thinking about its history and really looking at the place and its natural features as they have been transformed by the presence of humans.”

They’ll ask themselves: “What is it saying to you or what do you want to say to it? And so the artists will almost enter into a dialogue with the place.”

She continued, “We know how to make a home out of a house, but how do we make a home out of a town, a village or a region, and do we feel like we somehow belong? I think engaging the artists and asking some of those questions and coming up with a variety of answers is something that all of us who live and work here really need.”

University of Waterloo drama professor and Felt Lab executive Jill Tomasson Goodwin concurred.

“We are very excited to be supporting Isabella’s latest project, which will utilize a variety of supports that we extend to artists,” she said. “She will be using (the Felt Lab) space and the technology to invite artists to play and work in the space over this next year.”

Artists bring a unique perspective to the tech field, which is why the Felt Lab and its parent organization – Research Entrepreneurs Accelerating Prosperity (REAP) – encourage collaboration between the two industries.

Sponsored by REAP (Research Entrepreneurs Accelerating Prosperity), Quarry Communications and Christie Digital Systems, the Felt Lab facility provides a unique setting for artistic and innovative collaboration.

“Artists really take a technology and they push its technical capabilities to work through new ways to express new ideas artistically,” she explained. “On the other side, technologists get to see how out of the box thinkers use their technologies in innovative ways, which are often ways that they may not have even thought of before. That often sparks innovation, based on the artistic insights, uses and applications.”

A wide range of professional and student artists will descend upon downtown St. Jacobs this spring, using the Felt Lab studio to create shows which are expected to be performed for public audiences in the summer and fall.

For more information check out interartsmatrix.com.

Guest Post by Tina Wang

I was working on a project that re-creates an experience of Augmented Acoustic. My task was to create a music visualization environment, a prototype that visualizes live sound from the external environment. As I was new to interactive prototyping tools, Quartz Composer was recommended to me for creating my first prototype. While working on the project, I discovered some interesting things about Quartz Composer and decided to put together what I learned in a blog post.

What is Quartz Composer?
Quartz Composer is a node-based visual programming language for OS X that lets you create motion graphics, live visualizations, and interactive prototypes. You create a composition by connecting a number of patches with connectors (“noodles”) that send a signal from one patch to another. One of the things I like about Quartz Composer is that you can simply connect patches rather than write code, and the results come out almost instantaneously. The application provides a library of patches that you can select for your design, and you can even create custom patches of your own. Some basic patches include transitions, a color mixer, and plugins for audio or video inputs. Once you place those patches in the Editor, you can immediately see the results of your composition in the Viewer window. I found it rewarding to be able to examine my work immediately; you can even make changes while the composition is running. This quick feedback makes Quartz Composer an excellent and user-friendly tool for designers creating interactive prototypes.

Once you get the hang of it, Quartz Composer is not that complicated to work with. However, all tools have their limitations. For non-Mac users, the application is a bit disappointing, since it only supports the OS X and iOS platforms. Another challenge was testing, since there isn’t much documentation or many resources out there to make it easy to work with.

The Vision of Interactive Prototyping
Quartz Composer has a lot of potential for creating interactive prototypes. In industry, Facebook Home was one of the interactive mobile apps created with Quartz Composer, and the company released a free toolkit, Origami, that provides additional patches for people to use in their designs. As I mentioned earlier, you can create your own patches, so whenever you think of a piece of hardware or a tool you want to design with, Quartz Composer can likely accommodate it. In the Felt Lab, we have hardware such as a contact mic, Leap Motion, Emotiv, and many others that can be used with Quartz Composer. Since Quartz Composer has a wide range of useful features and creative options, it is a powerful tool that can be applied in many ways, from 2D mobile interface designs to 3D large-scale interactive displays. Future prototyping for better products and interactive arts will combine these tools to make effective and attractive interactions for users. That is the beauty of integrating arts and technology.

Breaking into the Learning Curve
Not many designers know of or have heard of Quartz Composer, possibly because of its limitations. The lack of resources also translates into a smaller base of users. On the bright side, Quartz Composer’s user guide is great at helping new users understand the concepts behind the application. On the downside, the application itself is somewhat buggy: occasionally it freezes or crashes. Despite that, Quartz Composer is still a great design tool to get your hands on, because you never know how it could help your designs until you try it.

If you are interested in working with Quartz Composer, you are more than welcome to use my music visualization prototype at the Felt Lab and build on it!

It is a new year, and while it is always fun to make resolutions and plan ahead for the year to come, it is also a time to reflect on the year that has passed, and boy, was 2014 a busy year for the lab. Not only did we sponsor and speak at various industry events, such as the annual UX conference Fluxible, but we also hosted several ‘thinkering’ sessions for various businesses and artists. In fact, we spent a great deal of time supporting arts endeavors not only throughout the region but in Toronto and British Columbia as well.

 

We were joined in September by Carey Dodge, the only Technology Director in the Canadian theatre scene, a role completely different from that of the traditional Technical Director. Carey comes from a technical background, originally in electroacoustics, and holds an MA in sonic arts from Queen’s University Belfast. His professional focus is integrating technology into performances by Boca Del Lupo, the theatre company where he works with his brother. Carey visited the lab for a week to work with our projection mapping software and the new version of the Microsoft Kinect. He was also generous enough to share his experiences with local theatre practitioners through Pat the Dog while he was here.

 

REAP also had the pleasure of sponsoring an art installation at Nuit Blanche by a group of Architecture students from the University of Waterloo. The students put together a massive installation with only a week to build it and thousands of kilometres between them, as they were all working at various co-op placements around the globe when they first approached us for support. They managed to pull it off, building a 24-foot wall holding 2,000 tubes strung with 4,000 lights, to spectacular effect.

 

Locally, we helped to support artists who were involved in Night\Shift, K-W’s own version of Nuit Blanche, which is hosted every November in the heart of downtown Kitchener. Our technology made appearances in various installations and performances. Our newest version of the Microsoft Kinect was used in the Samarian Woman 28 arts collective’s installation in the City Hall Rotunda, and our Christie Digital projector was used by the 12 Angry Filmmakers during their filming at the Walper Terrace Hotel.

While we could list several more examples, I think the most important thing to note is that there is a thriving arts scene in Kitchener-Waterloo, filled with vibrant minds who are willing to push the boundaries between arts and technology, and REAP is as happy to support these people and organizations in the year to come as we were last year.

If you have an arts project and you want to incorporate our technology, or if you need space, shoot us an email at reap.uw[at]gmail[dot]com. We’d be happy to hear from you!

For the past six months, REAP has been running a Meetup group on Augmented Reality and Virtual Reality. We’ve had a number of interesting presentations, ranging from the Myo armband (presented by one of its developers!) and an overview of various software packages for creating avatars, to a look back at the Kinect Hackathon and an in-depth examination of the Virtual Human Toolkit. Not a bad start!

We’ve also used the Meetup to brainstorm on some creative ways of bringing art and technology together, with Amy Ferrari’s artwork as a key focus.

The next six months are going to be an exciting time for the field of Virtual Reality. Several important Kickstarter campaigns expect to be delivering hardware, and as it becomes available we plan to do some hands-on sessions so people can become familiar with the capabilities of various devices.

The mounting kit for the Leap Motion controller has arrived and is now installed on our DK2 Rift. One of our upcoming meetups will show how having your hands and fingers with you in the virtual environment can greatly enhance your sense of presence.

The lab should soon be receiving its Sixense STEM tracker system. The STEM is a set of five wireless 6 DOF (degree-of-freedom) trackers that report their spatial position and orientation. The STEM can be used to track the movement of body parts (head, hands, feet) as a user moves freely in physical space, so that those same movements can be applied to their avatar.
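Applying a tracker's pose to an avatar boils down to a little vector math. The sketch below is illustrative only (it is not the Sixense SDK, whose API we haven't used yet): each tracker is assumed to report a position `[x, y, z]` and an orientation as a unit quaternion `[w, x, y, z]`, and an avatar joint is placed by rotating its local offset by that orientation and adding the tracked position.

```javascript
// Illustrative sketch of applying a 6 DOF tracker pose to an avatar joint.

// Rotate vector v by unit quaternion q (v' = q v q^-1, expanded form).
function rotateByQuaternion(q, v) {
  const [w, x, y, z] = q;
  const [vx, vy, vz] = v;
  // t = 2 * (q.xyz cross v)
  const tx = 2 * (y * vz - z * vy);
  const ty = 2 * (z * vx - x * vz);
  const tz = 2 * (x * vy - y * vx);
  // v' = v + w*t + (q.xyz cross t)
  return [
    vx + w * tx + (y * tz - z * ty),
    vy + w * ty + (z * tx - x * tz),
    vz + w * tz + (x * ty - y * tx),
  ];
}

// Place an avatar joint: rotate its local offset by the tracker's
// orientation, then translate by the tracker's position.
function applyPose(position, orientation, localOffset) {
  const r = rotateByQuaternion(orientation, localOffset);
  return [position[0] + r[0], position[1] + r[1], position[2] + r[2]];
}
```

With five trackers on head, hands and feet, the same function would run once per tracked body part each frame, and the avatar's joints would follow the user's real movements.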

Also coming out soon is the Perception Neuron system, which uses IMUs (Inertial Measurement Units) to do even more complete body tracking. One competitor to the Neuron, the PrioVR suit, should also be shipping in the next few months.

The Cyberith Virtualizer is a circular, omni-directional treadmill that allows users to move around a virtual world by walking instead of using a joystick. The Virtualizer is expected to ship in the next six months, along with at least one competitor (the Virtuix Omni). We hope to get one into the lab to demonstrate at a future Meetup.

And of course, the consumer version of the Oculus Rift is expected next year as well, along with a number of competitors.

Beyond just showing off hardware, we plan to have in-depth sessions on things like immersive cinema, multi-user technology and more.

Our big goal for the next six months is to increase our numbers. While we do have a small, enthusiastic core group of attendees, we’d like to get even more people out to explore and experience Augmented and Virtual Reality.

If you’re interested in finding out more, just join the Waterloo AR/VR Meetup group and you’ll receive notices of our upcoming events.