Livestream Day 1: Stage 3 (Google I/O ’18)

[Music]

"Welcome, and thank you for joining our session. It will begin soon."

All right, hello everyone, how are you doing? Okay, good. I hope everyone here takes something out of this talk and builds something amazing afterwards. Before I start, I want to get a feel for who I'm talking to. Who here is a developer? All right, that's amazing, you're in the right place. Who is an Android developer? Who has done something with Firebase? All right. A lot of back-end developers? Anyone who develops with Go? Awesome, cool. My favorite languages are Go and Java, so I'm going to stick to Java in this talk, to reach the biggest audience, basically.

So let's get started. Choosing the right storage is actually a really big problem; it's really hard to do. We have a lot of things to consider when choosing a storage system: things like the object size, the scale that you need, the latency requirements, throughput, consistency, and permissions. There are a lot of things you need to weigh when choosing the right storage system.

Now, when I created this talk, I was thinking about what I was going to talk about and how to construct it. I was wondering if I should start with the state of things 20 years ago, pretty much, where you had storage systems that were mostly local. Who still remembers these? I actually even started with the five-and-a-quarter-inch diskettes. They were pretty amazing, but I thought I'm not going to bore you with 20-year-old history. Even though it's really interesting and everybody should look into it, we are living in the now, and almost in the future, so I'm going to talk
a little bit about the things that you can do today. Okay, so how can I do that? I didn't want to give a dry talk in the sense of just walking you through the different options that you have with Firebase or with Google Cloud Platform; I wanted to make it a little bit more entertaining. So I thought, let's create a sample app.

Here's my startup idea. I'm a cloud developer advocate, so for me everything is about the cloud. My startup idea is cloudy.pictures: concentrate on one thing and do one thing well. This is where I want to get to by the end of the talk: I want an app where you can upload pictures, where you can crop and resize the pictures, where you can store metadata, and then, of course, display the pictures.

So what do we have in terms of options? I saw a couple of you are Firebase developers. For the ones that haven't touched Firebase so far, I really recommend looking into it. Firebase gives you a lot of tools to build applications, especially mobile applications, be it Android, iOS, or web applications. You have things like authentication, hosting, Firestore (which gives you flexible data storage for metadata), you have Cloud Storage for Firebase, and you even have something that can execute your server-side code, called Cloud Functions for Firebase.

Now, there might be situations where you have to get out of this, where you want to do more sophisticated tasks, things that you can't cover with Cloud Functions. There are limitations with Cloud Functions, and there are reasons to break out of this ecosystem. That's where Google Cloud Platform comes in. With Google Cloud Platform you have a multitude of tools at your disposal. On one hand you have compute, from bare VMs, where you can do pretty much everything that you want, almost up to Cloud Functions as well. So you have a multitude
of compute that you can use to do your things. We also have a multitude of storage products under the Google Cloud Platform portfolio, and some of these products are actually the exact same products that you can use with Firebase. You can think of it as having different interfaces to the same product, which makes it easier to deal with them for different use cases: if you build a web app, it's easier to use the Firebase SDK; if you build a back-end system, you can use our client libraries for Google Cloud Platform.

All right. The first thing, when we upload a picture, is that we need a place to store our pictures. That's where Google Cloud Storage comes in, or Cloud Storage for Firebase; they're both the same product. The product is a highly available, durable, simple, and secure object storage. What does that mean, what is object storage? Object storage stores pretty much any kind of blob you can think of, and a blob is basically bytes. It can be images, it can be videos, it can be normal files; it can pretty much be any byte stream you can find.

Now, Cloud Storage is optimized for end-user latency. There are a lot of things you can do: you can define caching strategies, things like that. You can make your object storage public so that people outside can access it; you can even set it up so that other people pay for the access to your Cloud Storage system. It is integrated with our over 100 edge points of presence, or PoPs. Basically, if somebody comes in from, let's say, Asia or Europe, they go to the closest PoP, which is connected to their internet service provider, and then they go directly into the Google network, and that means the latency to retrieve or even store objects is minimized. So it's a really good tool for our app; that's exactly what we want.

All right, so we're going to start with creating a bucket. A bucket is a unique root endpoint, so to speak, that you have to
create to be able to store some data. When you create a bucket, you define things like the storage class and the location where your bucket lives. As you can see here, I have two examples. Either you use our CLI, which is gsutil: you make a bucket, you define the storage class (in this case it's multi-regional), and my bucket is cloudy-pictures. You can do the same with Java if you want to: if you want to create buckets programmatically, you can use our storage APIs and create them directly from your Java program.

The next step is uploading a file. Again, very easy: if you're familiar with cp on any kind of operating system, you just use gsutil cp, optionally with the -m flag, which allows you to parallelize your upload, and you upload a picture to our Cloud Storage bucket. In Java it's pretty simple and straightforward as well: you point to a file, you make a byte stream or binary out of it, and you put it in the bucket, created with a name. As you can see in this example, I also gave it the content type; you can do that as well. If you want to do it from the web (for the ones that have done Firebase, this looks familiar), you basically use the Firebase web SDKs: you get the storage reference, you use a FileReader, you get the file name and the content from your input field, and you upload that to storage. Very straightforward; that's what we use for our app as well.

Next, last but not least for storage, is how to read a file, and again it's very straightforward. You can do a copy to download it, you can use Java with the Blob object and read it from your bucket, or on the web you basically get a download URL and then put this download URL into, for instance, your image tag or your background-image CSS attribute.

All right, so that's great: we uploaded all our pictures, or at least one picture. Now, how can I do image processing?
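The bucket, upload, and download steps just described can be sketched with the gsutil CLI. This is a sketch only: the file paths are hypothetical, the bucket name is the one from the talk, and an authenticated Cloud SDK setup is assumed.

```shell
# Create a multi-regional bucket (bucket names must be globally unique):
gsutil mb -c multi_regional -l us gs://cloudy-pictures

# Upload a picture; -m parallelizes the transfer:
gsutil -m cp ./photo.jpg gs://cloudy-pictures/uploads/photo.jpg

# Download it back:
gsutil cp gs://cloudy-pictures/uploads/photo.jpg ./photo-copy.jpg
```

The same operations are available through the Java storage client and the Firebase web SDK, as mentioned above.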
If I upload a picture from my phone or from my camera, these pictures are large, they're megabytes, and I don't want to always serve megabytes of pictures to my web page or to my users; it would drain the bandwidth a lot. So what I want to do is create some thumbnails. Now, there's a possibility to create thumbnails already on your clients, but I don't think that's a good idea; there are many reasons for not doing this. I could create thumbnails on the back end and trigger the thumbnail creation from my clients, but I don't think that's a good idea either: what happens if you upload a picture and then you get disconnected, or the thumbnail creation fails and you need to retry it? All this logic would be cluttering your front end and your client code; you want to do these things on the back end. So I want to use a different mechanism to trigger the processing of my image files, and that's where object notifications come in.

Object notifications on Google Cloud Storage allow you to register a hook which gets called whenever anything happens on your storage system. There are four event types you can hook your systems up to: there's OBJECT_FINALIZE, which is basically object create; there's object delete; there's archiving of an object; and there's any kind of metadata update that you do on your objects. You can hook into any of these events, and there are three main mechanisms that you can use.

On one end, if you use Firebase and Cloud Functions, you can use the Cloud Storage trigger for Cloud Functions: when you submit your function, you can say these functions get triggered if anything happens on my bucket. This is actually integrated with the Cloud Pub/Sub notification system. Cloud Pub/Sub notifications also allow you to set a notification watch on your bucket and say: if anything happens for these events that I declare, please send me a
Pub/Sub message, and then you can have a system that deals with the Pub/Sub message. And there are the legacy object change notifications that are still around; this is what we introduced fairly early on in the Cloud Storage product. If you register an object change notification, you basically get a call to an HTTP endpoint, and then you can deal with the message there as well. The nice thing about all of this is that if you set it up, it's integrated in a way that it does all the retries: if you don't acknowledge successful processing of a message, the system just retries it automatically, for up to 30 days.

Right, so how do I set up a Cloud Pub/Sub trigger? Here's an example using the gcloud CLI from the Cloud SDK. With gcloud pubsub topics create I create a topic. Then I declare a notification, in this case against my cloudy-pictures bucket, and I say I only want notifications when OBJECT_FINALIZE happens, so when an object is created, and I declare the topic this notification is going to be sent to. Then, if I want a system that receives these messages, I create a so-called subscription: gcloud pubsub subscriptions create, with a subscription which basically hooks onto the topic that I just created.

All right, let me show you really quick how that looks. Here you see my application. Right now it's pretty empty; if I refresh this, nothing happens. Throughout this talk I want to fill it with pictures. I can assure you, if I upload a picture here (I just select one picture and click upload), it's going to upload, but pretty much nothing else is going to happen. If I refresh my page, this picture is actually gone. So I need to implement all the things behind the scenes to process this picture. So the first thing we're going to do is set this up.
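The notification setup just described might look like this with the gcloud and gsutil CLIs. A sketch under assumptions: the topic and subscription names are hypothetical, and an authenticated Cloud SDK setup is assumed.

```shell
# Create a topic for upload notifications:
gcloud pubsub topics create notif

# Have Cloud Storage publish to it whenever an object is created:
gsutil notification create -t notif -f json -e OBJECT_FINALIZE gs://cloudy-pictures

# Create a subscription so a worker can receive the messages:
gcloud pubsub subscriptions create uploads --topic notif

# Pull messages manually to inspect them (without acknowledging):
gcloud pubsub subscriptions pull uploads
```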
Let me show you how to create this. Let me make that a little bigger; thumbs up, thumbs down if you can see it? All right, cool. So the first thing is gcloud pubsub topics create, and let's call it notif. All right, I created my notif topic. Perfect, that happened.

The next thing I'm going to do is create my watch command. Now, as you may have spotted, I actually named the topic differently, so I'm going to remove this and create the notification again. So what's happening now is that I'm creating a notification watch on my storage bucket; you see the bucket is named a little differently here, because that's the bucket I actually use for my application.

The next thing I want to do is see if any messages are coming in. So I can do gcloud pubsub, sorry, subscriptions... actually, I need to create a subscription first; I haven't done that. I want to create a subscription which I call uploads. So now I have a subscription that actually listens on that topic. This is a little slow, so let me turn off the Wi-Fi and hope that it works a bit better without it.

All right, now I want to listen to my Pub/Sub topic, so I'm going to do gcloud pubsub subscriptions pull; I'm not going to acknowledge, just pull on uploads. As you can see, right now there are no items, nothing happening. So let me upload a picture really quick again, upload that picture, and then go back to my system and pull uploads. As you can see, I now have a message, and in this message there are a lot of things: the ID, the selfLink, the name, the bucket, the generation, all these things that you can use. So next, let's go back to the slides.
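For reference, the payload of such an OBJECT_FINALIZE message carries the Cloud Storage object resource as JSON; it looks roughly like this. All values here are illustrative, not taken from the demo.

```json
{
  "kind": "storage#object",
  "id": "cloudy-pictures/uploads/photo.jpg/1526250000000000",
  "selfLink": "https://www.googleapis.com/storage/v1/b/cloudy-pictures/o/uploads%2Fphoto.jpg",
  "name": "uploads/photo.jpg",
  "bucket": "cloudy-pictures",
  "generation": "1526250000000000",
  "contentType": "image/jpeg",
  "size": "2048576"
}
```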
So I have my bucket. Now, when I upload my pictures and process them, I want to store some data about them. Even if I process my pictures now, nothing is going to happen; they're not going to be displayed on my web page. So I need something to process them and store some metadata about them. That's where we go back and look at the portfolio of systems we have available, and here I'm going to look at Firestore.

Firestore is a NoSQL back-end system which allows you to store document-based data, and the advantage is that it's very flexible, in the sense that if you want to add some fields, as I said, you can do that throughout the life cycle of the application. You don't have to be restrictive about the schema you're going to use; you can just evolve the schema while you develop your application. For Firestore, again, there are client libraries you can use from the web as well as from your back-end systems.

So, Cloud Firestore, just to summarize: it's very flexible; you have an expressive query language; you get real-time updates (if you use Firebase you can basically use watch commands, or get events for any data that has changed); you have offline support if you do a web application; and it's designed to scale. This is a system that can scale to massive sizes.

So how do you write data to Firestore? Here's an example for Java. Pretty much everything in Firestore is async. As you can see here, you get the document ref: you give your document a name, basically an ID; we have a collection and a document; and then you put your fields in there. To write, you use an ApiFuture, and you can also update fields afterwards if you want to. If you read data, it's pretty much similar to that; there aren't a lot of changes. On the document ref that you created, you just do a get, and that gives you the data.
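A minimal end-to-end sketch of this write-then-read flow, assuming the google-cloud-firestore Java client and application-default credentials; the collection, document, and field names here are made up for illustration and are not the talk's actual schema:

```java
// Sketch only: requires the google-cloud-firestore client library on the
// classpath and application-default credentials with Firestore access.
import com.google.api.core.ApiFuture;
import com.google.cloud.firestore.DocumentReference;
import com.google.cloud.firestore.DocumentSnapshot;
import com.google.cloud.firestore.Firestore;
import com.google.cloud.firestore.FirestoreOptions;
import com.google.cloud.firestore.WriteResult;
import java.util.HashMap;
import java.util.Map;

public class FirestoreExample {
  public static void main(String[] args) throws Exception {
    Firestore db = FirestoreOptions.getDefaultInstance().getService();

    // Write: everything is async, so set() returns an ApiFuture.
    DocumentReference doc = db.collection("pictures").document("pic-123");
    Map<String, Object> fields = new HashMap<>();
    fields.put("processed", true);
    fields.put("thumbAvailable", true);
    ApiFuture<WriteResult> write = doc.set(fields);
    write.get(); // block until the write is committed

    // Read: a get() on the same reference returns a snapshot with the data.
    DocumentSnapshot snap = doc.get().get();
    System.out.println(snap.getData());
  }
}
```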
Let's see how we can integrate that and actually write some data in our application. Now we're going to switch to some code. What I have here is my image-processing service. What happens here is that I set up my application with Google credentials. This uses the default credentials: either you run it on a Compute Engine instance, where you have a default context, or you provide a service account which is used to authenticate you. It's done very easily; you really only need this part, about six lines, to get the storage handle with the credential context. Then we need a project ID, and we define the storage that we want to use; this again is Cloud Storage for Firebase. I reference the Pub/Sub subscription I defined, and say: I created a subscription called uploads, so I can use the uploads subscription I created earlier.

This code you see here is basically listening for Pub/Sub messages, again asynchronously. Every time I get a message, I pass it into my processMessage function, which we're going to look at in a bit, and if the function is successful, I acknowledge the message. Now, with Pub/Sub you can have multiple subscribers on the same subscription, and your messages get round-robin distributed to all the subscribers on one subscription. If you want to make sure that each message is distributed to multiple places, and always to multiple places, you just create multiple subscriptions.

All right, let's look into our processMessage. What I'm going to do here is basically handle uploads of my images. Let me show you really quick: we have our storage system here, and if I go to storage, you're going to see some images, the two images that I uploaded earlier.
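The listening loop just described might be sketched like this, assuming the google-cloud-pubsub Java client and default credentials; the project and subscription names are hypothetical, and processMessage stands in for the resize-and-store logic:

```java
// Sketch only: requires the google-cloud-pubsub client library and
// credentials with access to the (hypothetical) "uploads" subscription.
import com.google.cloud.pubsub.v1.MessageReceiver;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.pubsub.v1.ProjectSubscriptionName;

public class UploadListener {
  public static void main(String[] args) {
    ProjectSubscriptionName sub =
        ProjectSubscriptionName.of("my-project", "uploads");

    MessageReceiver receiver = (message, consumer) -> {
      try {
        // processMessage(message); // resize, move object, write metadata...
        consumer.ack();   // ack only on success, so failures are redelivered
      } catch (Exception e) {
        consumer.nack();  // request redelivery
      }
    };

    Subscriber subscriber = Subscriber.newBuilder(sub, receiver).build();
    subscriber.startAsync().awaitRunning();
  }
}
```

Acknowledging only after successful processing is what gives you the automatic retry behavior mentioned earlier.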
Then I basically take the path, take the user ID out of it, take the file ID out of it, and do some resizing. I'm doing the resizing with ImageMagick; that's the reason I wanted to run this on a Compute Engine instance, for instance. I store these thumbnails in my bucket: as you can see here in the create function, I have a 600-pixel-wide thumbnail and a 1600-pixel-wide thumbnail. Then I move this object to a different folder, so it's not in my uploads folder anymore, and last but not least I clean up all my temporary files that I had in between.

So now, as you can see, I just started that. Let's see what happens: since we uploaded some pictures earlier on this subscription, it should actually get those messages. As you can see, it now gets these messages and does all the processing. Let's have a look at how that looks. We see our messages, and on my storage you can see whether anything is happening: we actually have uploads and raw images now, so we have images moving. Great, amazing. Then let's see if any metadata is coming in. Metadata is not coming in; right, I'm actually not writing any metadata yet, so I need to write my metadata first. We need to set that up.

The first thing is my Firestore init: this takes the credentials again and initializes my Firebase app and my Firestore. Next, before I move my data to the raw folder (actually, let's do it right after that), I want to update my Firestore metadata. Here I'm just going to add three fields: a processed field, a thumbnail-available field, and a large-available field. All right, I'm going to save this and restart my application. Something is missing; right, Firestore is not there. Of course, we need to actually provide it here.
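As an aside on the resizing step above: for fixed target widths like 600 and 1600 pixels, the scaled height that preserves the aspect ratio is just a proportion. A small sketch; the class and method names here are mine, not from the demo code:

```java
// Compute a thumbnail's height for a fixed target width while
// preserving the original aspect ratio.
public class ThumbnailMath {
  static int scaledHeight(int width, int height, int targetWidth) {
    // Scale the height by targetWidth/width, rounding to the nearest pixel.
    return (int) Math.round((double) height * targetWidth / width);
  }

  public static void main(String[] args) {
    System.out.println(scaledHeight(3000, 2000, 600));   // 400
    System.out.println(scaledHeight(3000, 2000, 1600));  // 1067
  }
}
```

The actual pixel resampling is then delegated to ImageMagick, as in the demo.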
I want to provide my Firestore instance here. All right, let's try this again. So now it is working; let's have a look if we upload a picture. Okay, I upload a picture, and this is actually just the client side displaying it. If I refresh, you see nothing, and that's okay: it's basically downloading the file, doing the ImageMagick processing, and then uploading the file again, so that will take a little while. We see a new element coming in; let's see if we get all the fields. All right, perfect: we now have the metadata there, which means our picture was processed successfully. Okay, awesome, we have the first picture there. Let's switch back to the slides.

All right, so we have our first minimum viable product: we can upload pictures, I can display pictures (amazing), and I can store some metadata on them as well. So now, what if our app grows? A lot of people, thousands of people, start to upload cloudy pictures; as I said before, this is a very targeted app, only for cloudy pictures. What I want to do now is reduce some costs. In the beginning, you saw I created a bucket which is a multi-regional bucket, and multi-regional is actually our most sophisticated storage class. To reduce cost, I have different options in the storage classes that I can use.

There are four storage classes available in Cloud Storage. The multi-regional one is distributed across at least two regions which are at least a hundred miles apart from each other, and gives you the highest availability for your needs. Now, if you do data processing and you want to process your data close to where it lives, then we recommend using a regional bucket: a regional bucket is in one region, it's a little cheaper, and it gives you a slightly lower availability SLA. When it comes to archiving data, we have the Nearline and Coldline storage classes. Nearline is pretty much: if your images
or any kind of objects are not used within 30 days, you can move them to Nearline, and then it's actually cheaper than having them stored in a regional or multi-regional bucket. If you use your data less than once a year, then it's recommended to move it to Coldline. You still have the same durability for your data objects, and you still have the same APIs, which is amazing, but the prices go down.

Now, one thing which is really important, to do all of this automated rather than in your application, is something called object lifecycle. With object lifecycle, you can define rules for when objects get automatically transitioned between the storage classes. On my bucket I could define a rule: if an object is older than 30 days, please move it to Nearline; if it is older than, say, 365 days, please move it to Coldline. I can also do that per object: I don't have to create a bucket with a specific storage class (you can, but you don't have to), and you can actually change the storage class per object. How does that look? You can either use gsutil and do it there, or you can use the Java SDK to move your object.

So I'm going to show that really quick. I have a second application here which extracts some more metadata. As you can see, it has pretty much the same setup code, but what it does is read the object and then use an SDK to get the EXIF data out of it. Now, when this is done, what I want to do is move this object away. Let's see where we can do this; we can do it here, and then move the object. How are we going to do this? All right, there it is: basically I take my object, I define the new storage class, build this, and say, okay, when my metadata is processed, I'm just going to move it to Nearline. So I'm going to start this here as well.
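The bucket-level lifecycle rules described above can be sketched as a JSON policy for gsutil. The thresholds match the examples in the talk; the bucket name is assumed:

```json
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
      "condition": {"age": 30}
    },
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 365}
    }
  ]
}
```

Saved as, say, lifecycle.json, it would be applied with `gsutil lifecycle set lifecycle.json gs://cloudy-pictures`.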
It's going to listen on the notifications as well; this is a topic where notifications have been sent before, so as you can see, it actually starts processing too. It might throw some errors, because I have deleted some objects, but it will basically process our objects in the background. Let's have a look at whether that's working. We have our objects, and in time, what you can see here is metadata showing up. This gives me all the EXIF data, which I can now use to search, for instance, or to build a world map where I can put all my pictures. And if I go to storage, which I'm going to do here, I go to my storage system, go to cloudy pictures, go to my raws folder, and as you can see, my pictures now have the storage class Nearline. So even though I created this bucket as a multi-regional bucket, I'm already moving my objects into a different storage class, which is really powerful. All right, switching back.

So we have that: we have reduced our costs by moving our objects and saving some money there. The next thing that I want to do is enable a business model; I want to generate some money with this. The app has grown, I've stored a lot of pictures, and now people have asked to actually print and frame and ship the pictures, so I want to enable this. Now, if I do this, I'm dealing with money, and if I'm dealing with money, I actually need a transactional system. I don't want to do this with Firestore, because I think the transactional guarantees that I have with Firestore are not sufficient for what I want to do. That's where another storage system comes in: we have a managed storage system called Cloud SQL. Cloud SQL is a managed database system where you can have MySQL or Postgres, and it also gives you things like automatic updates; you can do backups; you can do failovers
into other clouds, and you can do failover into on-premise systems, things like that. It is vertically scalable, so you can change your node size throughout the life cycle of your application, and you have things like a 99.95% availability SLA.

So how do we create a MySQL instance? Again, very straightforward: gcloud sql instances create, you give it a name, you set the tier (which is basically the size that you want to use) and the region where this Cloud SQL instance is created. The next thing is that I want to talk to my Cloud SQL MySQL instance. What we provide is a proxy which automatically authenticates you; you have to provide it a service account file. As you can see here, you basically download the Cloud SQL Proxy, make it executable, and then you can run it either locally or on your Compute Engine instance, or, if you are familiar with Kubernetes, which is a container management system, you can run it directly in your pods as a sidecar to connect to your Cloud SQL instance. It's pretty much straightforward.

Who here has done something with MySQL, or any SQL database? All right, amazing, so this will look very familiar: a relational schema. I'm creating a database and some tables. In this case I'm creating a user table (with the user "cloud aficionado"), a balance table, which is basically my account balance, and a transaction table, which shows me the history of the transactions that have been done. And then I can do things like a transaction which goes across multiple tables: here you can see I have a transaction that takes some money from the balance table, puts it somewhere else in the balance table, and also writes into my transaction table that this has happened. It's pretty much straightforward.

Now, the next step: my app goes viral. What we need now is a scalability check. When I build cloudy.pictures 3.0, I want it to be scalable.
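The multi-table transfer described above might look roughly like this in SQL. The table and column names here are my guesses for illustration, not the actual schema from the demo:

```sql
-- Atomically move 10 units from user 1 to user 2 and record the transfer.
START TRANSACTION;
UPDATE balance SET amount = amount - 10 WHERE user_id = 1;
UPDATE balance SET amount = amount + 10 WHERE user_id = 2;
INSERT INTO transaction_history (from_user, to_user, amount, created_at)
VALUES (1, 2, 10, NOW());
COMMIT;
```

If any statement fails, a ROLLBACK leaves both balances untouched; that all-or-nothing property across tables is exactly why a transactional system was wanted here.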
Scalable for a large number of people, that is. So let's check: Cloud Storage is scalable. Cloud Firestore, or Firestore on Firebase, is also scalable; it's a multi-tenant service, you don't have to pre-provision any resources, you basically just use it with your application, and the more you use it, it will scale with your needs. Amazing.

Now, Cloud SQL, unfortunately, is the one solution that doesn't scale with your needs. You can scale it up to a certain point, basically the biggest VM that your cloud provider (in our case Google Cloud Platform) provides you, but if you grow beyond the size of the biggest VM, you're kind of stuck. There are things you can do, of course: you can do sharding and build a distributed RDBMS yourself, but it's very challenging, costs a lot of time investment, and is very risky. So this is where I want to plug in Cloud Spanner. Cloud Spanner is our horizontally scalable, distributed relational database system: you get strong consistency with it, and it scales with your needs; you can scale to thousands of nodes, and it scales horizontally, your schema permitting. If you want to learn a little bit more about Cloud Spanner, there's a shameless plug at the bottom of the slide: if you search for my video series about Cloud Spanner, you'll find me going through some of the nitty-gritty and the amazing things you can do with it.

So let's recap: what is the portfolio of data and storage products under Google Cloud Platform? We have covered Cloud SQL; we have mentioned Cloud Spanner; we have mentioned Cloud Datastore and Firestore. There are a couple of others here on the slide: if you use App Engine, you can use Memcache inside App Engine; and if you have a lot of data which you want to search and process really fast, with very high read and write throughput for non-relational data, then we recommend to use Cloud Bigtable.
Cloud Bigtable has an HBase interface, so you can connect it to, for example, your Spark jobs, things like that. Then we also have a managed data warehouse system, which is BigQuery, where you can load relational data in and crunch terabytes of data.

All right, with this, thank you very much for your attention. If you have any questions, please feel free to walk up to the microphones; I'm happy to answer some of your questions. Please also follow me on Twitter at @hostirosti, and I'm happy to answer any kind of question there as well. So thank you very much for your attention, I hope you enjoyed the talk, and have a good Google I/O. Any questions? No? All right, thank you very much.

Hello and welcome to I/O Live. I'm your host, and I'm here with Haris, product manager on Android Auto. Hi Haris. "Hi Marina." So tell me, what's new for developers on Android Auto? "This is a very exciting year: we have the newest set of changes in the APIs for media and messaging developers, which is the biggest change that we've made in a long while. For media, developers are going to be able to do content browse, which is pretty exciting, and they're going to have a different set of layouts they can use to present their content better to the user. For messaging, we have new messaging styles which are going to enable them to do, for example, group chat and a couple of different things, and for all the thousands of developers that we have for Android Auto, that's pretty exciting." Am I able to already download the apps directly and get them in the car? "Yeah, so actually what we have right now is we're also showcasing a preview of the Play Store and the Google Assistant in vehicles. We've announced today that we're going to be shipping some of these Google services with Volvo, so that's pretty exciting." Anything else? "No; I would say, if you're
developing for Android Auto, if you have media or messaging apps, we're excited to work with you. Check out the documentation that we have online and start developing Android Auto apps. Thank you! Thanks, Marina. For I/O Live.

Hello and welcome back to I/O Live. I'm Timothy Jordan, and I'm standing here with Jorge. Jorge has been applying machine learning to diabetic retinopathy detection. Hi Jorge, thanks for joining us. Nice to be here. Okay, let's start with: what is diabetic retinopathy? So diabetic retinopathy is the main cause of blindness among working-age adults in the world, and the sad thing is that it's 90% preventable if patients would get regular eye exams and timely treatment. The problem is that people don't get regular eye exams, and so we've been going out and getting images of the retina in order to get a diagnosis, to see who is at risk for vision impairment. And how do you apply machine learning to the problem? So right now an ophthalmologist, an eye doctor, or an optometrist has to look at the image and determine what the level of retinal disease is before a patient can be referred, you know, if they need referral for treatment. With AI you can get an immediate grading of what's going on in the retina, so you don't have to wait, sometimes an hour, two hours, two or three days, or even weeks, to get a response; you can act immediately for people that need care. And I imagine this is a scalable thing as well, for areas that don't have access to the same medical help? Right, so vision impairment is expected to increase by three times by the year 2050, and there's just not enough manpower, there are not enough eye doctors in the world to be able to take care of everyone. So this is really the only chance to get the people who need care identified and into treatment, and to make sure that they're able to get treated. Yeah, what's next after this? What else do you think you can apply machine learning to, for people? Yeah, so that's really exciting: it looks like AI can see a lot
of things in the retina that humans can't, so we're looking to apply it to cardiovascular disease, neurological disease, and a host of other problems that we're hoping, you know, the algorithm can find with just a simple retinal image. Awesome. Jorge, thanks for sharing your story, and thank you for the work that you're doing. Thank you. I'm Timothy Jordan, and this is I/O Live.

My parents play guitar; that's probably my first memory. When I started learning about how sound waves are just vibrations in the air, my eyes opened to a new world, and I think that's when it clicked that I knew I wanted to do engineering. Daniel had a musical background, and that's an important part of working on audio: you have to have an understanding of the way sound works. The scientists wanted to figure out a way to count how many blue whales there were on any given day. So the process involves recording audio from the ocean, but it's too much information for a human to try to look at and listen to. We were able to turn those sounds into spectrograms, which are just images of what sound looks like, and fed those images into TensorFlow, this machine learning tool. It allows us to take a mammoth pile of data and distill it into something meaningful. We can answer a lot of questions about the way we are affecting the marine environment and how we can help conserve it. When I was a kid I didn't think I was going to be a scientist or an engineer, but knowing that I'll never stop learning makes me feel pretty lucky.

Hey everyone! Wow, I didn't do anything and already got applause; that's a pretty good start. I see a lot of familiar faces, but the stage is getting bigger every year. So hey, my name is Sascha Prüter, I'm on the product team for Android TV. And I'm Benjamin Baxter, DevRel for Android TV. Thanks for joining the Android TV session at Google I/O. The session is called "What's New", but we're also going to talk a little bit about what has happened in the last year. So thanks for joining, and let's get going. The last time most of us talked was last year at Google I/O, and the Android TV ecosystem has grown significantly since then, and we are obviously really happy about that. We now have more than a hundred partners working with us on Android TV devices, and as you can see, we've been doubling that number every year so far. The growth comes from a number of devices and device categories: you may have heard at CES earlier this year about a bunch of new partners, in very different smart-TV price categories, launching Android TV devices. We are in the very lucky position that we can't even list all our partners on one slide anymore, so if you're here, you're working with us on Android TV devices, and your logo is not here: sorry, we're running out of space. Also in the set-top box space, and specifically when it comes to pay-TV operators, we see significant growth, and we are very happy about that. We already have, I think, around 30 partners worldwide, pay-TV operators (cable, satellite, IPTV) shipping boxes with Android TV, and we have more than 50 additional ones coming, so we are really happy and we see good growth there. But it's not only about hardware, it's also about software, and that's why I'm really happy that the TV app ecosystem
is growing significantly as well. I think this is the fifth time I've said "significantly", so I'll stop doing that. All of you are contributing to apps, and having developers here at I/O over the last few years actually pick up Android TV and build cool media experiences, games, or completely new stuff we didn't think about is really awesome. When we started the TV Play Store, I think in 2014, we had 25 apps or so; now we're approaching 4,000. And because there are a lot of app developers here, we wanted to mention, since it's a little bit under the radar, that we have also launched DCB on Android TV. A lot of you, if you're developing for mobile, know direct carrier billing already: if someone buys an app, makes an in-app purchase, or buys something in a game, you can charge it to your mobile bill, which helps if you don't have a credit card or don't want to use one. The same concept has launched on Android TV since last Google I/O, and we're working with more and more pay-TV operator partners so that you can charge purchases from the Play Store, or the Play ecosystem in general, to your cable, satellite, or TV subscription bill. One of the things we have started putting more emphasis on is the Google Assistant. You heard us briefly talk about this last year at Google I/O, and since then the ecosystem and the services that the Google Assistant brings, not only to Android TV but to devices in general, have grown. We see a lot of use of the Assistant on TV, and we think it's a great additional tool to make the use and consumption of media services and apps a lot easier on TV, by just interacting with voice: discovering content by simply talking to your TV, having a very natural interface, not having to remember what that search phrase was again. It's natural, it's easy, and it works. But it's not only about finding content; it's also about looking up answers to questions you might be
interested in, like who that actor is, and it's also about controlling other devices in your house. The Assistant on Android TV gives you all of that, and because we think it is so important, we are putting a lot of priority this year on getting the Assistant into more countries. We want to make sure we do it right, so we don't want to rush anything, but since we launched in the U.S.

in late 2017, we are bringing a lot more countries into the Assistant ecosystem on Android TV this year, as you can see. And even if you are in a country or a language that is not yet Assistant-enabled and you have to wait a little bit, we have refreshed our voice search experience in those countries too. I mentioned earlier that we're really happy about all the apps, and about more and more apps coming to the Play Store. Last year here at Google I/O we gave you a sneak peek of the new Android TV home experience and system UI, and how apps can integrate with it. Back then it was a sneak peek, a preview; it has since launched with Android Oreo, and a lot of partners are now starting to roll it out to Android TV devices. So we thought it's a good opportunity to go into more detail on how your apps can actually take advantage of this content-first experience we are trying to achieve with Android TV, and Ben is going to talk a little bit about what it actually means to have a content-first app. Thanks, Sascha. So we really wanted to redefine this experience; we wanted it to be very content-first, content-driven. If you're familiar with the previous home screen, we had one row for recommendations, a double row for apps, and a double row for games. We've changed that, as you can clearly see. Apps are still important: we have one row for apps, just your favorite apps, and your users can customize it and add your app to that row; it's at the user's discretion. The recommendation row from Android N and below has been broken up into several different responsibilities. We have a play-next row where you can add content, and we'll talk more about this later, but you can add content there to be picked back up, re-engaging the user later. The next thing we did was break things out so that each app can have its own channel, its own surface on the home screen. We're trying to really push this content-first design, and every app can contribute to it. As Sascha mentioned
earlier, the Assistant is also on the home screen. It's not specific to Android O or Android P; it's actually supported from Android M and above, and it's there to enhance search and keep pushing that content-first design. Okay, I keep saying content-first, it's my favorite phrase all of a sudden, but let's look at some really good examples before we dive into the technical details of what makes a content-first design. Here we have Google Play Movies, and this is on the home screen: you can see all the details, as if the movie details screen were right there on the home screen, to help build an engaging experience. This is great for users: they don't have to go into the app and hop back out; they can make decisions right there from the home screen. Taking it a step further, we have video previews. If you have a movie, a trailer makes a great video preview; if you're an audio app, a ten-second clip makes a great audio preview. We've found that having previews is very engaging for users and drives engagement into your app. I mentioned the play-next row earlier: this is a great place for you to add content to bring users back in. If you're watching a really long movie, you add a little bit of extra metadata and we'll show a beautiful progress indicator to add more context, and users are able to hop right back into the app. And as I said with channels, you're not just limited to one little row on the home screen; you can have as many channels as you want. If we look here, we have channels like "Featured" and "Top Free": these are great channels that are going to stay fresh, with indications of being updated daily, even hourly. Okay, so now that we have nice context about what's on the home screen, let's dive into how you can build this experience. So what is a channel? A channel is just a logo and a name; that's all it is. It's a container for programs. This is where you theme your content and customize everything deeper in the
programs. To build a channel, we have a support library. It just uses the builder pattern, super easy and convenient for you. Everything on the home screen should be clickable, so if you click on a logo, it should open back into your app: just set the app-link URI, and this lets you open your app from the home screen. The next important thing is the internal provider ID. This is an ID for your app, telling the home screen: "hey, keep track of this; I know what this ID is, and when I go to query my channel later, I can synchronize my channel with what the home screen knows about my app." And that's it. It uses a content provider, and we have convenience methods: those of you who know the joys of content providers know there can be boilerplate code, and we've abstracted all of that into the support library to make it really easy for you. Those familiar with content providers also know that URIs can be tedious and cumbersome to manage; the support library manages all of that for you. It's just a simple insert statement, and you get back a channel ID. As I said earlier about the internal provider ID, you can use the channel ID and the internal provider ID together to make sure that when you synchronize and update your channels later, you'll have all the keys you need. Okay, channels, like I said, are just the shell of a row; the programs are what really matter. So let's look at an example of a program. In this program we have a bunch of things happening: a thumbnail image, a title, a description, and anything else you know about the program can be added to the home screen. The more metadata you add, the richer the experience and the better the user engagement, so anything that's on your details page is perfectly natural to put on the home screen. Again, it's the builder pattern; this time you want to set the channel ID on your program so the home screen knows which channel to put it in.
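As a rough sketch of the builder pattern just described, here is what publishing a channel and one program might look like with the TvProvider support library (`androidx.tvprovider`). The channel name, URIs, and IDs below are made-up placeholders, and error handling is omitted; this is an illustration of the API shape, not a drop-in implementation.

```java
import android.content.ContentUris;
import android.content.Context;
import android.net.Uri;

import androidx.tvprovider.media.tv.Channel;
import androidx.tvprovider.media.tv.PreviewProgram;
import androidx.tvprovider.media.tv.TvContractCompat;

public final class HomeScreenPublisher {

    /** Inserts a preview channel and returns the row ID the home screen assigned to it. */
    public static long publishChannel(Context context) {
        Channel channel = new Channel.Builder()
                .setType(TvContractCompat.Channels.TYPE_PREVIEW)
                .setDisplayName("New Releases")                                  // the channel's name
                .setAppLinkIntentUri(Uri.parse("myapp://channels/new-releases")) // clicking the logo opens the app
                .setInternalProviderId("new-releases")                           // your own key, for syncing later
                .build();

        Uri channelUri = context.getContentResolver().insert(
                TvContractCompat.Channels.CONTENT_URI, channel.toContentValues());
        return ContentUris.parseId(channelUri);
    }

    /** Adds one movie program to the given channel and returns its program ID. */
    public static long publishProgram(Context context, long channelId) {
        PreviewProgram program = new PreviewProgram.Builder()
                .setChannelId(channelId)                              // which channel row this program lives in
                .setType(TvContractCompat.PreviewPrograms.TYPE_MOVIE) // drives which metadata is shown
                .setTitle("Big Buck Bunny")
                .setDescription("A big rabbit takes on three bullying rodents.")
                .setPosterArtUri(Uri.parse("https://www.example.com/poster.png"))
                .setPreviewVideoUri(Uri.parse("https://www.example.com/preview.mp4")) // played by ExoPlayer
                .setContentId("bbb-2008")                             // unique ID, used for de-duping
                .setIntentUri(Uri.parse("myapp://programs/bbb-2008")) // clicking the card deep-links into the app
                .build();

        Uri programUri = context.getContentResolver().insert(
                TvContractCompat.PreviewPrograms.CONTENT_URI, program.toContentValues());
        return ContentUris.parseId(programUri);
    }
}
```

The two IDs you get back are exactly the keys the talk mentions: keep them (together with your internal provider ID) so you can query, update, and delete the rows when you synchronize later.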
If your program has more than one channel it can be in, make sure to add a unique identifier. The home screen will look at this identifier and say, "hey, we already know about this program in this channel, and it's also in this other channel", and apply de-duping logic so the user doesn't see duplicates, which makes for a cleaner, smoother experience. You also want to set the type; in this example we set the type to movie, and the type drives the metadata that's shown on the home screen. And the video previews I showed you with Red Bull earlier: it's as simple as adding a URI. You just say, "hey, my video preview is at http://www.myvideo.com", and the home screen will play the video for you. It uses ExoPlayer under the covers, so all of the video formats supported by ExoPlayer are supported for you out of the box. If you have DRM or a more complex use case, we do support a solution where you can draw your video on a surface; come see us at office hours tomorrow and we can go into more depth on that. Inserting a program looks very similar to channels: you convert it to content values, and then you need a URI. The URI is dynamic, since you're inserting a program into a specific channel, so use the support library to build it. After you insert, you get back a program ID. Now your program is in the channel, and you have a program ID and a channel ID: all the tools you need for synchronizing, updating, and deleting later when you go to update your content. Okay, I told you how to make the channel: use the builder pattern and add it to the content provider. But when do you make it? When is very important. We put a lot of time into figuring out how to get you started, and we ended up with a new intent: there's an initialize-programs intent that gets triggered, and this can happen before your app even starts, which isn't a bad idea. Your app gets downloaded onto a TV, or it gets updated, and all of a sudden you have content on the home screen: a great entryway for users into your app.
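That initialize-programs intent can be picked up with a plain broadcast receiver. A minimal sketch, again assuming the `androidx.tvprovider` library; the receiver name and the `ChannelSyncScheduler` helper are my own, hypothetical names:

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

import androidx.tvprovider.media.tv.TvContractCompat;

/**
 * Fired by the home screen, possibly before the app has ever been opened,
 * asking the app to create its channels and programs.
 *
 * Registered in the manifest with an intent filter for:
 *   android.media.tv.action.INITIALIZE_PROGRAMS
 */
public class InitializeProgramsReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (TvContractCompat.ACTION_INITIALIZE_PROGRAMS.equals(intent.getAction())) {
            // Kick off channel/program creation off the main thread,
            // e.g. via JobScheduler or WorkManager.
            ChannelSyncScheduler.schedule(context); // hypothetical helper
        }
    }
}
```

Because the broadcast can arrive before the app's first launch, the receiver should only schedule work, not do the inserts inline.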
Just listen for this initialize-programs intent, then set up your channels and start your process. Okay, the last thing to talk about is the Google Assistant. I think the Google Assistant is a pretty fantastic experience; it pushes that content-first approach even further. The things you can do to integrate with the Google Assistant are to implement search, support deep links, and handle playback controls. So let's look further. When you perform a search with the Google Assistant, the Assistant keeps this content-first design. Search for my favorite movie, Big Buck Bunny (I hope it's your favorite too), and you'll see a bunch of information: the thumbnail, the description, the title. And if your app matches, on the title, duration, and year at a minimum, with the metadata returned, it will show up in the list of apps the content is available on. How do you match this? What happens under the covers? We use more content providers; this time your app is providing data to the Google Assistant. The Assistant will just pass a URI for you to perform your search on. In this content provider you can do whatever you want: local database calls, network calls, whatever you need to do to perform your search. This URI contains the raw search string from the Google Assistant, but it will be sanitized: if you say something like "play Big Buck Bunny", it's going to send you "Big Buck Bunny", recognizing that "play" is a command. So be aware that the URI you get contains the key for what you should search for. Okay, controlling playback: if you're familiar with media sessions, you don't have to do anything, your app already supports it, but let's take a closer look. If you want to support the Google Assistant and handle commands while watching shows, for example "Hey Google, pause the movie", in your app you just implement a media session and supply a callback with onPause or onPlay, and the Google Assistant will trigger that callback through the media session.
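A sketch of that callback side, using the support library's `MediaSessionCompat`. The `MyPlayer` interface is a stand-in for whatever player the app actually uses; it's an assumption for the example, not an Android API:

```java
import android.support.v4.media.session.MediaSessionCompat;

/** Routes transport controls from the Assistant (and the remote) to the app's player. */
public class PlaybackCallback extends MediaSessionCompat.Callback {

    private final MyPlayer player; // placeholder for the app's real player

    public PlaybackCallback(MyPlayer player) {
        this.player = player;
    }

    @Override
    public void onPlay() {                  // "Hey Google, resume"
        player.play();
    }

    @Override
    public void onPause() {                 // "Hey Google, pause the movie"
        player.pause();
    }

    @Override
    public void onSeekTo(long positionMs) { // the Assistant sends one absolute position
        player.seekTo(positionMs);
    }

    @Override
    public void onSkipToNext() {            // "Hey Google, play the next song"
        player.next();
    }

    @Override
    public void onSkipToPrevious() {        // "Hey Google, play the previous song"
        player.previous();
    }
}

interface MyPlayer {
    void play();
    void pause();
    void seekTo(long ms);
    void next();
    void previous();
}
```

You would attach it with `mediaSession.setCallback(new PlaybackCallback(player))`; with that in place, the remote's transport keys and the Assistant's voice commands arrive through the same path.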
You can also just hit the pause or play button on the remote; maybe not the most useful feature, but it's there, and it's actually a really cool experience. My favorite one is onSeekTo. You say, "Hey Google, fast-forward five minutes", and the Google Assistant actually does the math: it says, "hey, your media session has a state, I know the current position in that state, you said fast-forward five minutes, that's 300 seconds, so I'm going to send the new position to your app." If you're familiar with media sessions, you know there are also onRewind and onFastForward; you should still implement those methods, but the Google Assistant ignores them, since it already does the math to figure out where to send the user. This next one is more useful for audio apps than video apps: if you say "Hey Google, play the next song" or "OK Google, go back and play the previous song", all you have to do is implement onSkipToNext and onSkipToPrevious. So, to recap some of the best practices: on the home screen, listen for the initialize-programs intent; it's the gateway for creating your programs and building that experience on the home screen. When it comes to metadata, you cannot have enough: the more metadata you add, the richer the experience and the better the engagement from users. Keeping with that theme of engagement, add previews; they really drive engagement and build a nice experience for users. And keep content fresh: if you have stale content, you might lose trust with users. If you have a channel on the home screen that you haven't updated in a year, once users have watched all those programs, they don't have much of a reason to go back in. So keep the content fresh, and the cadence depends on your app: if you're an app like YouTube or Haystack with user-curated content, maybe update every couple of hours; if you're an app like Google Play Movies, where the content is already curated, maybe update once a day or a couple of times a week.
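To make the onSeekTo arithmetic above concrete: the Assistant reads the current position from the media session's playback state, adds the spoken offset, and hands the app a single absolute position. A plain-Java sketch of that computation; the class and method names are mine, not an Android API:

```java
public final class SeekMath {

    private SeekMath() {}

    /**
     * "Fast-forward five minutes": current position plus the spoken offset
     * (negative for rewind), clamped so we never seek past the end of the
     * media or before its start. All values are in milliseconds.
     */
    public static long resolveSeek(long currentMs, long offsetMs, long durationMs) {
        long target = currentMs + offsetMs;
        return Math.max(0L, Math.min(target, durationMs));
    }
}
```

So one minute into a one-hour movie, "fast-forward five minutes" is an offset of 300,000 ms, and `resolveSeek(60_000, 300_000, 3_600_000)` yields 360,000 ms, which is what arrives in onSeekTo.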
The goal is to keep the content fresh so users have something to engage with every day. And when it comes to the Assistant, the media session is your friend: anything you want to do with the Google Assistant, the media session is the key. The good news is that you can do all of this today; it's live on Android O. And now I'm going to turn it back over to Sascha to talk about things you can do in the future. Thank you very much. Thanks, Ben. It's interesting to see the different reactions in the audience: some people taking notes, taking photos, and discussing the code slides, and other people thinking, "dude, just launch some devices". So we'll talk a little bit more about what's new. You heard this morning from Dave Burke and some others about the new stuff in Android P, and of course Android TV will also get some new additions in Android P. I will only highlight a few areas we're investing in. One is definitely performance. We have done a lot of work in Android P to make Android TV perform better and faster, even on entry-level hardware. So even on some of the very affordable smart TVs or set-top boxes you might have seen, we want to make sure everything is snappy and fast, and we have invested a lot in that in Android P. As a developer, some things to take a look at: think about your app and its behavior. Is there some animation, some feature, you might want to disable or tweak on low-memory devices? We see that as one of the most common problems for apps on entry-level hardware. So take a look at isLowRamDevice() and tune your app's behavior accordingly. Play around with it a little bit: maybe tweak some animations, disable animations on certain devices, or reconsider certain features. Also use the memory profiler to really check your app: profile it and see where it might run into bottlenecks.
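The check itself is a one-liner on `ActivityManager`; here's a sketch of gating an expensive feature on it (the "reduce effects" framing is a made-up example of what an app might tune):

```java
import android.app.ActivityManager;
import android.content.Context;

public final class PerfTweaks {

    private PerfTweaks() {}

    /** True when expensive extras (preview animations, blurs, ...) should be skipped. */
    public static boolean shouldReduceEffects(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        // isLowRamDevice() (API 19+) reports true on entry-level hardware.
        return am != null && am.isLowRamDevice();
    }
}
```

Call it once at startup and branch your animation or feature configuration on the result, rather than sprinkling the system-service lookup throughout the app.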
We still see that as one of the most common issues: an app suddenly has a drop in FPS. Also use Android Vitals to monitor the performance of your application. But again, you should see Android P on Android TV devices perform much faster and much better, and we will see more devices because of the reduced hardware envelope. We also obviously wanted to update some things for users and make it even easier to get to a fully set-up Android TV experience. I mean, let's be honest, not everyone enjoys setting up a new device, and we wanted to make the whole experience, from unboxing the device to having everything ready and set up, a lot easier and faster, so you don't have to deal with it much. In Android P the whole setup process is a lot more streamlined; I think we reduced the overall time it takes you to go through it by almost a third. We also have some additional cues in there, and the phone integration is better: if you have an Android phone, you will get a setup notification so you can transfer account details, for example your Google account, easily to the TV. We have also improved the browser sign-in for non-Android devices, so iOS devices or maybe your laptop: it's a lot faster, a lot easier, and it should get you through setup much more smoothly. But that's only setting up the device with your account and making sure all the settings work. On a smart TV or an OTT box, you also want all your apps, right? You want all the content. So as part of the setup process we also have the next generation of Play Auto Installs, and we already recommend apps that you have installed on other Android TV devices, on past Android TV devices, or maybe on your phone, if a corresponding TV app exists. We automatically suggest those to you as part of the setup flow, and you can select there: yes, I want all these apps, or maybe, I want only a few of
them. Just a few clicks on checkboxes, you hit go, and everything is downloaded and installed automatically: no more manual searching in the Play Store after you have set up a new device. But downloading and installing an app is only one piece we wanted to optimize. What's also not a great experience is going into an app and then discovering that you have to sign in with a username and password, and having to awkwardly enter them with your remote control. That's why we're also introducing Autofill with Google on Android TV. If you have ever entered your credentials in one of these apps on another Android device, we automatically suggest those to you: you just say yes, all the login credentials are filled in, and you're logged in automatically without entering usernames or passwords. So that's setup. Another area where we saw a lot of users spend a lot of time searching for stuff, and obviously not really enjoying it, is settings. Sometimes you need to tweak something, or you skipped a certain setup step when setting up your new device, or you want to change your audio settings or add a different account or something like that. We have revamped settings on Android TV to be a lot more streamlined and a lot less cluttered, and we will automatically suggest settings we think you are looking for. For example, when you skipped account login, as you can see in the animation here, we will highlight that; or when we saw you tweaking around with some apps and something wasn't right and you go into settings, we will maybe highlight the app settings for you. We want to make it as easy as possible to quickly get in, tweak the stuff you need, and get out again. So that was a lot about Android P. You can download the Android P preview SDK, play around with the emulator, including the TV emulator, and try it out. There are some other neat things in there, for example external camera support, so you could start writing camera apps
for TV if you want to, and you can start playing around with it. We are also going to release a lot more documentation around the performance aspects of TV apps in the near future. But we wanted to talk about two more things. We talked a lot about software so far, but we also thought about cool hardware experiences, cool devices that could expand the experience in the living room. One of the questions we asked ourselves is: how would a really cool living-room experience look, with a device as the center, the hub, of the living room, with the Assistant integrated, very easy to use, controlling all my other devices, and bringing the best Android TV experience onto my TV screen? The outcome of asking that question, the answer to it, was something like this: "Hey Google, turn on the TV." "OK Google, play the Clemson football game on ESPN." "OK, what's it called again? Curiosity? Opportunity?" "Hey Google, what's the new Star Trek show?" "Here is some information about Star Trek: Discovery." "Hey Google, show me potato battery videos on YouTube Kids." "All right, here are potato battery videos on YouTube Kids." "College football overtime: according to Wikipedia, it is commonly ten minutes long." "This is good. This is really good." "Hey Google, buy more kettle corn." "Here's what I found on Google Express." "I meant the show!" "I know, hold up." "Hey Google, dim the lights, turn off the TV, and play some music on Pandora." So, as you might have seen starting yesterday in some media outlets, we announced, together with our partner JBL, the JBL Link Bar, powered by Android TV. It's an awesome sound bar, it has the Google Assistant integrated, and because we put far-field microphones into the device, you don't even need to pick up the remote control: you just sit on the sofa, start talking to the device, and can control the whole experience. It has really awesome sound, and there will be an optional subwoofer that you can order with it as well. But what I think is a really cool feature is
the three HDMI inputs. You might think, well, why is that so cool? Because you really can control your whole home entertainment experience with this. A lot of you might have a PlayStation or an Xbox, right? Or you might still have that Blu-ray player, or maybe that cable set-top box you still really like. You can connect all these devices to the JBL Link Bar and then use the Assistant to control them — for example, you want to switch to the PlayStation, and it just magically happens. And also, let's say you're playing a game on the PlayStation and then you have a question for the Assistant: you just ask, and the Assistant can show you the answer cards over your current HDMI input. So it's not just pass-through HDMI — these are active inputs, and it really shows you the power of the Assistant and Android TV as a living room platform. The other cool thing here is that we are working to get this into a speaker-only mode as well, because sometimes you don't need the TV screen, right? To listen to music, or to just ask something — really turning this down into a very low-key smart speaker is also cool. So this device can do all of those things, and maybe sometimes you just want to send some music via Bluetooth or Cast audio to the device — that works as well. It's your one-stop shop for the living room: you can control everything, it's super easy. It will also come with a remote control, but if you prefer to just use it with voice, that is really easy. You should check it out — we have it in our Android TV sandbox here in Sandbox C; ask some of the folks from the team to give you a demo. It's really cool, it sounds good, and it will launch in fall 2018. So I said two more things, right? One other thing: we talked a lot about Android P earlier, and as a developer you want to follow all the things Ben just told you, and you might ask, yeah, you told me I can download
the Android P preview SDK and work with the emulator, but I really would like to have actual hardware for this. So, I don't know who of you was at Google I/O 2014 at Moscone Center — lots of hands, OK, some of you were already at Moscone in 2014 — and you might remember we launched a device back then to introduce Android TV. It was called the ADT-1. So we thought — well, you can applaud if you want — we thought it's time for another developer device, so we are introducing the ADT-2. You might want to write down that signup form link, because you're the first people to actually see this and be able to sign up for it. All the smartphones — it's actually a pretty good test for Google Lens. Anyway, if you go to the sandbox later and check out the soundbar and use badge-in, you also get an email with this link as well. Yep, good point. So this is a neat little HDMI dongle. It will come with a BLE-enabled — sorry, a voice-enabled remote control, so you can try out all the Assistant integration tips that Ben just talked about. It will run an Android P Developer Preview release, and we will send updates to the device — by the way, we are also sending updates to the soundbar; the Android TV team will send the system updates and new Android versions right to the device — so you can use this device to try out all the cool things. It's a limited edition device, so now that you know about it first: sign up. We will actually start to send these out later in the summer. It's a neat device, we are really excited about it, so sign up for it and build cool apps. That's kind of it already. I said two more things — it was two more things. If you're working on apps, we have some more interesting events that you might be interested in. Then — I don't know if you want to mention it — come visit us in office hours; we have back-to-back app reviews and office hours. We're going to be down the road a little bit at the office hours tent, so if you have questions,
come bring them. And if you're bored and you still want to do TV stuff, we have two new code labs in the code lab section: one focuses on just the Play Next row, and the other one focuses on an overview of the entire home screen. We look forward to seeing you — bring your questions there. Yep, and talking about questions: obviously we want feedback from you, but if you have questions — you see those microphones; we're not using them because of timing, we will get shuffled out of here very soon — Ben and I will be over right after this talk in the Android TV sandbox. Come over, talk to us, ask us questions, and I will most likely point to Ben to answer those questions. Thank you very much, build cool TV apps, and I hope we'll see you next year. Thank you.

To make Android development with Kotlin more concise, pleasant, and idiomatic when working with Android framework classes, we created Android Kotlin extensions — a set of extensions to the framework that covers some of the most commonly used classes like View, SharedPreferences, Canvas, Animator, and others. For now, Android KTX is in preview and the API is likely to change before reaching the stable version, but here's how to integrate it in your projects to check it out, some examples of usage, and how you can contribute. In the build.gradle file of your app, add the Google repository if it's not there already, and then add the Android KTX library to your dependencies. Let's start with a simple example. Let's say that we want to create a Uri from a String. Normally you would call Uri.parse on the String, but with Android KTX we can just call toUri on the String. When we want to save a value in SharedPreferences, our code would look like this: get the editor of the preferences, put the value, and then call apply. With the extension, we just need to call edit and pass a lambda block with the action — under the hood it actually does the same thing as the previous code. Working with classes from the graphics package, we've added extension functions for some of the most important classes there: Canvas, Bitmap, Path, Color, and others. So let's say that we want to draw the difference between two paths, offset downwards by 100 pixels. First we get the difference of the paths, then we translate the canvas, and afterwards we draw the new
path. All of this code can now be simplified like this. OK, let's take another example, this time with a View, and say that we want to execute an action before the view is drawn. The default implementation requires us to add an OnPreDrawListener, make sure it's removed before the action is triggered, and then trigger the action. With Android KTX, we can just call doOnPreDraw on our view and trigger the action. These are just a few of the extensions available so far — check out the docs to find out what else we have there. If you want to suggest more ideas for extension functions and you'd like to contribute, check out the Android KTX GitHub project. We're still in preview, the API is not stable yet, so your feedback is valuable and can shape the library. This is just the beginning of Android KTX: we're working on extensions for the support library and architecture components. Kotlin on Android is here to stay, and we have big plans for it. Follow us on GitHub, YouTube, and Twitter for more news.

Hello and welcome back to I/O Live. I'm Timothy Jordan, and I'm standing here with Nile and with Shaza, and they built an app using TensorFlow. We're going to talk about it a little. Nile, could you tell me what the app is and how you came up with the idea? Sure, so the app is actually a plant classification app. You basically take your camera and hold it over a plant leaf; the app then tells you what type of plant it is, whether it's healthy or has a disease, and what the disease is. We actually came up with it one day at lunch: Shaza had been talking about wanting to create an app, and we were brainstorming some ideas, and I thought about my mom, who has a huge passion for gardening. About a year ago we ended up moving, so she had to transport her garden from a big backyard to a tiny little patio in an apartment, and thus all of her plants were dying. So we ended up making an app and helping her garden flourish. Awesome. Shaza, what was it like getting started with machine learning? I got started
with machine learning through a TensorFlow code lab called TensorFlow for Poets, and I learned all about how machine learning works and how to use convolutional neural networks on your own data set using the TensorFlow API. Did you find that challenging? It was very challenging — it took me a long time to learn because it was my very first programming experience — but it was a really amazing project. Awesome. Now what's next — do you have any ideas for new apps? Yeah, so I definitely want to do something that has to do with women in third world countries. I have a really close friend who's actually a national leader in the organization Girl Up, and their topic is dealing with women in third world countries, so we're definitely working together right now on something to benefit that. OK, one last question: do you have any advice for somebody out there who's looking to get started in machine learning, or even just computer science? To someone who's looking to get started in computer science, I would tell them that it's a lot easier than you think, and there are a ton of resources out there if you want to learn about machine learning or artificial intelligence. Awesome, thank you so much, and thank you for joining us. This is I/O Live.

Hi, and welcome to I/O Live. I'm Florina Muntenescu, your host, and I'm here today with Sascha Prüter, director of product for Android TV. So tell me, Sascha, what's new with Android TV? Hey, yeah, we have a bunch of stuff that we're excited to talk about, specifically with developers at Google I/O — because it is a developer conference after all — so we do a lot of talks and sessions here about how TV app developers can integrate really well with Android TV. As you can see on the screen, if I just scroll around a little bit: we want app developers to get their top content, with really beautiful pictures and all the metadata, on screen. But we want to make sure it's not only YouTube or HBO that can do that stuff — we want every app developer to be able to do this, right? So
we talked a lot about new ways you can do this integration really nicely, what ways we have to help you with this, and we basically want to enable developers to have the best content on the TV and make the content really shine, because we want to help them make their apps popular, right? And then, obviously it's Google I/O, so we're talking about Android P as well. We have a bunch of new stuff in Android P for users but also for developers. Some examples for users: we have a brand-new setup flow that gets you all your favorite apps already when you set up a device, and we have features like autofill, where if you have already logged in on some other device you get your login credentials on Android TV, so you don't have to clumsily enter passwords with the remote control — no one likes that, right? But there are also cool new platform APIs in Android P, for example external camera support, so you could start to develop camera apps on TV as well. Yeah, that's some of the stuff with Android P, where you can use the new Android P preview SDK — it has a TV emulator, and you can get started with that if you want. I know last year you launched the Assistant in the US, but I live in London — can I finally use it in London too? Very good question. International support for the Google Assistant on Android TV is rolling out as we speak; specifically, the UK and some other European countries roll out in the next few weeks. But since you mentioned the Assistant, one really cool thing we announced at Google I/O is our new soundbar project that we did with JBL. It's the JBL Link Bar that you can see here, and we really like the combination of having a TV device powering a TV screen but with really awesome sound and a far-field microphone built in, so you don't really need the remote control anymore — you can basically just sit on the sofa and control your TV device or any device connected to the soundbar. So
you can ask things like: Hey Google, what's my agenda today? Today there's only one thing on your calendar. It's at 6 p.m.

And its title is "pick up the dog". Yeah, I really shouldn't forget that. But it's really nice — you don't have to get up from the couch and look for the remote control anymore, you just talk to your TV in a very natural way. Hey Google, go home. OK, great, so I can continue losing my remote control too. All right, still, I'm an Android developer, so what can I do practically — is there some code I can already write? Yeah, I mean, you could download the Android P preview SDK with the emulator, but we know developers want to work with real hardware, so one really cool thing we're announcing here at Google I/O is the ADT-2 developer device. Those of you who have been developing for Android TV for a while might remember we had an ADT-1 developer device in 2014, launched at Google I/O, and after four years we thought it's time to do that again. So it's a neat little Android TV dongle you can connect to your TV. You can sign up for this — we have a signup form live, and we are selecting; it's a limited edition device, but we'll make sure a lot of developers are getting one of those. It's a really cool, small Android TV dongle with a voice-enabled remote control, so all the things we've been talking about — the content integration, the Assistant integration for your apps — you can try all of that out on the device. Great, thank you very much, Sascha. So we have a lot of new things for Android TV, both for developers and for end users, and now all we need to do is check out the developer documentation and also sign up for the dongle. Thank you, Sascha. Thank you. This is Florina, for I/O Live.

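The mechanism behind the Android KTX examples described above — calling toUri() on a String, or edit { } on SharedPreferences — is just Kotlin extension functions combined with lambdas-with-receiver. The sketch below shows that pattern in plain Kotlin; the Uri and Prefs classes here are hypothetical stand-ins for the Android framework classes so the example can run anywhere, while the real KTX extensions ship in the Android KTX artifact and delegate to Uri.parse() and SharedPreferences.Editor in essentially this way.

```kotlin
// Hypothetical stand-in for android.net.Uri.
data class Uri(val value: String)

// KTX-style conversion extension: String.toUri() just wraps the parse call.
fun String.toUri(): Uri = Uri(this)

// Hypothetical stand-in for SharedPreferences: changes go through an
// editor and only take effect once apply() is called.
class Prefs {
    val data = mutableMapOf<String, String>()

    class Editor(private val target: MutableMap<String, String>) {
        private val pending = mutableMapOf<String, String>()
        fun putString(key: String, value: String) { pending[key] = value }
        fun apply() { target.putAll(pending) }
    }

    fun edit(): Editor = Editor(data)
}

// KTX-style edit { } extension: obtains the editor, runs the lambda with
// the editor as receiver, then calls apply() for you — the get-editor /
// put / apply boilerplate disappears at the call site.
inline fun Prefs.edit(action: Prefs.Editor.() -> Unit) {
    val editor = edit()
    editor.action()
    editor.apply()
}

fun main() {
    val uri = "https://developer.android.com".toUri()
    println(uri.value)           // prints the wrapped string

    val prefs = Prefs()
    prefs.edit { putString("theme", "dark") }  // no explicit apply() needed
    println(prefs.data["theme"]) // prints "dark"
}
```

The call sites mirror the ones from the talk: "some string".toUri() replaces Uri.parse(...), and prefs.edit { putString(...) } replaces the explicit get-editor, put, apply sequence, while doing the same work under the hood.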