
Calling students and contributors: list of available code and more developments for our 2018 projects

During the summer of 2018 and the upcoming first semester of 2018/2019 we're focused on two developments.
This is a call for our students and mentors to collaborate on them.

Project task coordination will be handled by Deniz Yüksel and Andreu Ibanez; contact them for more information.



Our team of students, residents, and visitors is working hard on the development of the LGxEDU applications. The Liquid Galaxy for Chromebooks installation is 90% developed (with the help of Oscar Carrasco), and the Android controller application is on its way.

Here we want to offer some tasks related to integrating Google Assistant technology into the Liquid Galaxy system.

- Develop a library to connect the Google Assistant, in its Google Home form factor, to the Liquid Galaxy.
This will consist of a Dialogflow agent and a webhook that connects directly to the master computer of a Liquid Galaxy system.

Initially, the functionalities we want to see are two (a rough sketch follows after this list):
    . Implement voice commands to fly the Liquid Galaxy to
      places. E.g. "Ok Google, go to Izmir, Turkey", for the LG to
      move to Izmir in Turkey.

    . Implement voice commands to move the camera. E.g.
      "OK Google, move forward" to move the camera forward, or
      "OK Google, turn right 90 degrees",
      and similar actions to implement all the usual movements a
      visitor to an LG installation would make with a Space Navigator
      joystick.
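
To make the task more concrete, here is a minimal sketch of what such a webhook could look like. It is a sketch under assumptions, not a specification: the Flask endpoint, the intent names ("fly_to_place", "move_camera"), the parameter names, the master address, and the "move=" command format are all made up for illustration. The only Liquid Galaxy convention it relies on is that Google Earth on the master reads queries such as "search=..." from a query file (commonly /tmp/query.txt).

    import subprocess
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    LG_MASTER = "lg@192.168.1.42"   # hypothetical user@host of the LG master
    QUERY_FILE = "/tmp/query.txt"   # query file read by Google Earth on the rig

    def send_query(query):
        """Write a Google Earth query (e.g. 'search=Izmir, Turkey') on the master."""
        subprocess.run(["ssh", LG_MASTER, f"echo '{query}' > {QUERY_FILE}"],
                       check=True)

    @app.route("/webhook", methods=["POST"])
    def webhook():
        body = request.get_json(force=True)
        intent = body["queryResult"]["intent"]["displayName"]
        params = body["queryResult"]["parameters"]

        if intent == "fly_to_place":
            # "Ok Google, go to Izmir, Turkey" -> fly the rig to that place.
            place = params.get("place", "")
            send_query(f"search={place}")
            reply = f"Flying the Liquid Galaxy to {place}."
        elif intent == "move_camera":
            # "OK Google, turn right 90 degrees" -> relative camera movement.
            direction = params.get("direction", "forward")
            amount = params.get("amount", 1)
            send_query(f"move={direction},{amount}")  # hypothetical command format
            reply = f"Moving the camera {direction}."
        else:
            reply = "Sorry, I don't know that Liquid Galaxy command yet."

        # Dialogflow v2 expects a JSON body with a fulfillmentText field.
        return jsonify({"fulfillmentText": reply})

    if __name__ == "__main__":
        app.run(port=8080)

The endpoint would then be registered as the fulfillment URL of the Dialogflow agent, so both intents end up calling the same webhook.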

You can see this in action in this video from student Marcel Porta, showing the demo he created in 2017 to launch projects on the Liquid Galaxy from a Google Home.



In the near future we also want to develop the Google Assistant functionality in the LGxEDU Android app, for launching and playing the quizzes. You can ask for the mockup we have in development to start with this.

In this Moqup you can take a look at the current screens of the LGxEDU application under development.





For the first semester we want to start work on the Dronecoria project, where we'll help creator Lot Amorós with some tasks we can develop with our current knowledge:

- Create a demo application for the Liquid Galaxy to show the drone routes of the Dronecoria project.
This will start as a demo, but our goal is to develop it for real. To make the demo with hardcoded KMLs, we'll reuse the code we already developed in these projects:

You can watch his videos here, where he talks about his developments.
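
As a starting point for the hardcoded-KML demo, here is a minimal sketch. The waypoints, file paths, and URL are made-up assumptions; the only Liquid Galaxy convention assumed is the kmls.txt file used by many rigs, where Google Earth reloads a NetworkLink listing the KMLs to display.

    # Hardcoded Dronecoria-style route: (longitude, latitude, altitude) waypoints.
    ROUTE = [
        (0.6200, 41.6100, 120),
        (0.6215, 41.6112, 120),
        (0.6230, 41.6125, 120),
    ]

    KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Document>
        <Placemark>
          <name>Dronecoria demo route</name>
          <LineString>
            <altitudeMode>relativeToGround</altitudeMode>
            <coordinates>{coords}</coordinates>
          </LineString>
        </Placemark>
      </Document>
    </kml>
    """

    def route_to_kml(route):
        """Build a KML document with the route as a single LineString."""
        coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in route)
        return KML_TEMPLATE.format(coords=coords)

    def publish(kml_text,
                kml_path="/var/www/html/dronecoria_route.kml",
                kmls_list="/var/www/html/kmls.txt",
                base_url="http://lg1:81"):
        """Write the KML on the master and register it in kmls.txt (assumed layout)."""
        with open(kml_path, "w") as f:
            f.write(kml_text)
        with open(kmls_list, "w") as f:
            f.write(f"{base_url}/dronecoria_route.kml\n")

    if __name__ == "__main__":
        publish(route_to_kml(ROUTE))

Once real route data is available from Dronecoria, the hardcoded ROUTE list would simply be replaced by the actual flight plan.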




- Implement a TensorFlow model, using the AIY Vision Kit, that will define which areas of a burned forest are suitable to reforest with Dronecoria. The idea is that a drone will make a first flight over the area, taking pictures with an onboard camera, and classifying which areas are good to seed again. This will be decided by the ML model with some parameters still to be defined, maybe the colours of the area and more.



The analysis will be done either on the drone itself with the Vision Kit mounted on it, or by taking the images from the drone's camera and processing them on a local PC with TensorFlow, or in the cloud.
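
As an illustration of the kind of model involved, here is a minimal TensorFlow sketch that cuts an aerial photo into tiles and classifies each tile as reforestable or not. The network, tile size, and labels are placeholders for illustration; the real criteria (colours of the area and more) are still to be decided, and the trained model could later be exported for the AIY Vision Kit or run on a PC or in the cloud.

    import numpy as np
    import tensorflow as tf

    TILE = 64                               # pixels per tile side
    LABELS = ["unsuitable", "reforestable"]

    def build_model():
        """Tiny CNN over RGB tiles; a real model would be trained on labelled flights."""
        return tf.keras.Sequential([
            tf.keras.layers.Input(shape=(TILE, TILE, 3)),
            tf.keras.layers.Rescaling(1.0 / 255),
            tf.keras.layers.Conv2D(16, 3, activation="relu"),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Conv2D(32, 3, activation="relu"),
            tf.keras.layers.GlobalAveragePooling2D(),
            tf.keras.layers.Dense(len(LABELS), activation="softmax"),
        ])

    def classify_image(model, image):
        """Cut the image into TILE x TILE patches and classify each patch."""
        h, w, _ = image.shape
        results = []
        for y in range(0, h - TILE + 1, TILE):
            for x in range(0, w - TILE + 1, TILE):
                patch = image[y:y + TILE, x:x + TILE][np.newaxis, ...]
                probs = model.predict(patch, verbose=0)[0]
                results.append(((x, y), LABELS[int(np.argmax(probs))]))
        return results

    if __name__ == "__main__":
        model = build_model()               # untrained: for illustration only
        fake_photo = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
        for (x, y), label in classify_image(model, fake_photo):
            print(f"tile at ({x}, {y}): {label}")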

