Without tempting fate, it rectified itself before the Last Post and One Minute's Silence, and it has been fine during the first half.
Not sure that is strictly true. The answer is yes and no. Streaming to a 42-inch or 100-inch smart TV will certainly consume more data than streaming to a smartphone, but it is the selected resolution that is the reason. 720p is 720p and 1080p is 1080p whether viewed on a small screen or a large one; a 100-inch TV will pull no more data than a 42-inch TV. Small screens like phones simply do not need the resolution required for viewing on a TV.

Slightly off topic, much of the work, e.g. upscaling HD to pseudo-UHD (4K), is done by the device's electronics. Full UHD (4K) transmissions obviously require a lot of data and fast broadband speeds.

In summary, I believe that when you connect to a streaming source like YouTube, your device negotiates with the source server to determine what resolution and frame rate to stream down to it. If you have a 4K TV, it will request 2160p if the server provides that resolution, a standard HDTV will request 1080p, and under some circumstances (like network congestion) both TVs might switch down to an even lower resolution. Phones and tablets can display the stream at even lower resolutions while still filling the screen and providing a sharp image.
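For anyone curious, here is a minimal sketch of the kind of selection logic being described: the player picks the highest rendition that both fits the display and that the measured bandwidth can sustain. The rendition ladder, bitrates, and function names below are illustrative assumptions, not YouTube's actual negotiation protocol.

```typescript
// Hypothetical rendition ladder; real services publish their own in a manifest.
interface Rendition {
  height: number;      // vertical resolution in pixels (2160 = 4K)
  bitrateKbps: number; // bandwidth the stream roughly needs
}

const ladder: Rendition[] = [
  { height: 2160, bitrateKbps: 20000 },
  { height: 1080, bitrateKbps: 8000 },
  { height: 720,  bitrateKbps: 5000 },
  { height: 480,  bitrateKbps: 2500 },
];

// Pick the best rendition the display can use and the network can carry.
function pickRendition(displayHeight: number, measuredKbps: number): Rendition {
  const candidates = ladder.filter(
    r => r.height <= displayHeight && r.bitrateKbps <= measuredKbps
  );
  // Fall back to the lowest rung if nothing fits (heavy congestion).
  return candidates[0] ?? ladder[ladder.length - 1];
}

console.log(pickRendition(2160, 25000)); // 4K TV, fast line -> 2160p
console.log(pickRendition(720, 25000));  // phone -> 720p, far less data
console.log(pickRendition(2160, 6000));  // 4K TV, congested -> drops to 720p
```

Note the phone case: even on a fast connection it never requests more than its own screen can show, which is why a small screen pulls less data regardless of how much bandwidth is available.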
Anyone keep getting kicked out, with a message along the lines of: "You may not have permission to play this video" every 10 minutes or so?
It's fvcking disgraceful, to be frank. You have paid for the service, and yet still get spammed with adverts. I am surprised more people don't kick off about it. The same with Sky subscribers. I previously cancelled an iFollow account, which I had paid for, when I was getting Sky Bet adverts before whatever it was I wanted to watch.
"I think we were unlucky, with us hitting the... oodles of noodles, thank ya, mah man!" It fills me with rage.
Thanks for the info. I was basing it on my web development experience, where the client sends back the properties of the output device and the web server then sends the appropriate data for that device. I'll have to do some reading up. I'm watching on my laptop today; I normally connect it to the TV via HDMI.
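That web pattern is essentially the same idea, just done once per asset rather than continuously. A minimal sketch of what I mean, assuming a hypothetical image endpoint that accepts a width parameter:

```typescript
// The browser knows its own display properties.
const physicalWidth = window.screen.width * window.devicePixelRatio;

// Ask a (hypothetical) image service for a version sized to this display.
const url = `https://example.com/image?w=${Math.round(physicalWidth)}`;

fetch(url).then(res => {
  // The server returns the closest pre-rendered size: a phone gets a
  // small file, a 4K display a large one. Video players apply the same
  // principle, but renegotiate continuously as conditions change.
  console.log(`Fetched ${res.headers.get("content-length")} bytes`);
});
```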
Yes, but I normally combine my trips to Oakwell with visiting family etc., so that would mitigate it somewhat.
I actually ran a test some time ago: streaming Netflix to my home cinema projector upscaling to pseudo-4K, then switching to my upscaling 4K smart TV. The data download was identical. Interestingly, when my Pioneer receiver's output was switched to Sub (projector) + Main (TV), the download data increased, so it appears using dual monitors increases the data stream content. Not sure why, since at that point it is the home network router that is handling the data delivered via the modem. I could understand it if the source material was different, e.g. my wife watching a TV programme on one channel on the TV while I watched football on the PC upstairs.

Screen size, when the input resolution is fixed, does affect quality. My old DVD stuff played on the projector is bad enough, picture quality wise, to make me watch it on the TV. Blu-rays, especially when upscaled by the player, look stunning when projected.
Mirroring from a phone to a TV, in my experience, means a crappy picture based on the sender's pushed resolution 'negotiation' with the phone. I am guessing it doesn't see an HDMI/mirror connection to a larger device, which is simply acting as a dumb monitor. My Panasonic Viera TV's mirroring is flaky to say the least; it works (or not) when it feels like it.
Same for me, all game, every 15 seconds. I've muted it now and am messing about with my phone instead; the TV's doing my head in.