I now have the sample HelloTextGoogleCast application working.
I'd like to now retrofit another application that I have so that the onscreen images are displayed on Chromecast.
I can see how the HelloTextGoogleCast application sends text and so on. Should I now be looking at the media sample application, or do I somehow encode the image I have and send it as a text message? The docs here https://developers.google.com/cast/docs/ios_sender leap straight from text to "media". Perhaps I should be using GCKMediaStreamTypeNone?
If the photo is not available or accessible from the cloud, then the right approach is to run a simple web server inside your iOS app to "serve" the images. Then, for each image, send a short text message to the receiver containing the image's URL (since the sender is serving the image, it knows the URL), and modify the receiver you are using to read that URL and set it as the src of an
<img/> tag. Note that encoding your image as an ASCII string (e.g. Base64) will produce a payload larger than what you can send in one message, so if you were to take that approach, you would have to encode the image, break it into multiple pieces (I think the maximum size for a message payload is 64K, but I might be wrong), stitch them back together on the receiver, decode the result, and then set it on an
<img/> tag; that would be much more work (and much slower) than the approach I suggested.
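To illustrate the URL approach, the receiver-side change might look something like the sketch below, using the same receiver SDK wiring the HelloText receiver sample uses (CastReceiverManager and a custom message bus). The namespace, the message shape, and the `photo` element id are all assumptions for illustration; adjust them to match your sender.

```javascript
// Hypothetical custom namespace; it must match the one your sender uses.
var NAMESPACE = 'urn:x-cast:com.example.photos';

// Pure helper: pull the image URL out of an incoming message payload.
// Assumes the sender sends either a bare URL string or JSON like {"url": "..."}.
function extractImageUrl(data) {
  try {
    var parsed = JSON.parse(data);
    return parsed.url || null;
  } catch (e) {
    return data; // not JSON, so treat the payload as a bare URL string
  }
}

// Receiver wiring (only runs inside an actual Cast receiver page).
if (typeof cast !== 'undefined' && cast.receiver) {
  var manager = cast.receiver.CastReceiverManager.getInstance();
  var bus = manager.getCastMessageBus(NAMESPACE);
  bus.onMessage = function (event) {
    var url = extractImageUrl(event.data);
    if (url) {
      // Assumes your receiver HTML contains <img id="photo"/>.
      document.getElementById('photo').src = url;
    }
  };
  manager.start();
}
```

On the sender side you would send the corresponding text message (the image URL) over that same namespace, pointing at the embedded web server's address on the local network.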
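For completeness, if you did go the encode-and-chunk route, the split/stitch logic (independent of the Cast messaging itself) might look like this sketch. The 60K chunk size is an assumption derived from the uncertain 64K limit mentioned above, leaving a little room for framing fields:

```javascript
// Stay a bit under the (guessed) 64K per-message payload limit.
var MAX_CHUNK = 60 * 1024;

// Sender side: split a Base64 string into numbered chunks small enough to send.
function splitIntoChunks(base64, chunkSize) {
  var total = Math.ceil(base64.length / chunkSize);
  var chunks = [];
  for (var i = 0; i < base64.length; i += chunkSize) {
    chunks.push({
      index: chunks.length,       // position of this piece
      total: total,               // so the receiver knows when it has them all
      data: base64.slice(i, i + chunkSize)
    });
  }
  return chunks;
}

// Receiver side: stitch the chunks back together. Sorting by index keeps
// this robust even if messages were ever handled out of order.
function stitchChunks(chunks) {
  return chunks
    .slice()
    .sort(function (a, b) { return a.index - b.index; })
    .map(function (c) { return c.data; })
    .join('');
}
```

You would still have to Base64-encode the image on the sender, send each chunk as its own message, and decode the stitched string back into image bytes on the receiver, which is exactly the extra work (and latency) that makes the URL approach preferable.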