I can finally use HDMI on my phone, and it’s not thanks to Samsung


Back when I got my Galaxy S II I was simply psyched to have a phone with HDMI capability, and even a use for it thanks to the phone's video playback. I ordered the official MHL adapter, and was confused when I connected it to my DVI-only screen via an adapter – the same way I connect my laptop, desktop computer, PS3, camera, and iPad 2 – and got 480p output. A bit of research later, it appears this is a bug/shortcoming of most of Samsung's mobile electronics, both tablets and phones. The devices apparently can't use a "dumb" one-way DVI adapter (the most common type, as HDMI and DVI are essentially the same thing) because they then have no clue what resolutions the receiving end supports, and unlike seemingly every other device out there, they don't have the ability to simply guess and output something. 
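For the curious: my reading (an assumption on my part, not something Samsung has confirmed) is that "what resolutions are supported" refers to the display's EDID block, which the monitor sends back over the cable's DDC lines – and a passive HDMI-to-DVI adapter passes those lines straight through, so the data is there for any source that bothers to read it. As a minimal sketch, here is how the preferred mode can be pulled out of a raw EDID 1.3 base block (offsets per the VESA spec; the sample bytes below are fabricated for illustration):

```python
def preferred_resolution(edid: bytes) -> tuple[int, int]:
    """Extract the preferred mode from the first detailed timing
    descriptor of a 128-byte EDID 1.3 base block (starts at offset 54)."""
    assert len(edid) >= 128 and edid[0:8] == bytes.fromhex("00ffffffffffff00"), "not an EDID block"
    dtd = edid[54:72]
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)   # horizontal pixels
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)   # vertical lines
    return h_active, v_active

# Fabricated EDID advertising a 1920x1080 preferred mode in the first DTD:
edid = bytearray(128)
edid[0:8] = bytes.fromhex("00ffffffffffff00")        # EDID header magic
edid[54:72] = bytes([0x02, 0x3A,                     # pixel clock, little-endian (148.5 MHz)
                     0x80, 0x18, 0x71,               # h active/blank: 1920 / 280
                     0x38, 0x2D, 0x40,               # v active/blank: 1080 / 45
                     0, 0, 0, 0, 0, 0, 0, 0, 0, 0])  # timing details we don't need here
print(preferred_resolution(bytes(edid)))             # -> (1920, 1080)
```

In other words, a source that reads EDID sees a DVI monitor's supported modes exactly as it would an HDMI one's; failing to fall back gracefully when it doesn't like what it reads is a software choice.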

Well, I wasn't really expecting Samsung to fix a software issue that has shipped in every device, old and new, for years, so I just learned to live with it. I got some use out of the HDMI adapter while visiting a friend who has an HDMI-capable TV, and ended up giving him the adapter to use with his S II. Then, over Easter, my old monitor broke, and I ended up ordering a new one with HDMI despite being able to repair the old one to some degree (it's a time bomb now). I ordered a new MHL adapter, and it finally arrived. At last, I can use my phone (and my Galaxy Tab 7.0 Plus) with my own monitor. 

So Samsung, not being able to use a DVI adapter is just a massive fail. Apple offers both HDMI (which works with DVI) and VGA adapters for its devices, making them great for business use, while Samsung's HDMI-enabled devices can't even use DVI properly. Despite now having two Samsung devices in my arsenal, I have no real brand loyalty to the company, and this is one of the reasons why (though not the only one). All I kept thinking when I read Calob's post about Nokia's awesome customer service a couple of weeks ago was "my phone can't even use a DVI adapter, and Samsung doesn't care". Sometimes it's the smallest things that make you realize a company doesn't care. 


Andreas Ødegård

Andreas Ødegård is more interested in aftermarket (and user created) software and hardware than chasing the latest gadgets. His day job as a teacher keeps him interested in education tech and takes up most of his time.


15 thoughts on “I can finally use HDMI on my phone, and it’s not thanks to Samsung”

  • Yeah, boo on them for actually conforming to industry standards. How dare they!

  • “Industry standard” for USB is 2.5W, but even devices that charge over a USB connection come with chargers rated at 2-4 times that. Conforming to industry standards is good up to the point where you can add more functionality to cover real-life usage scenarios. That’s why none of my other devices have even the slightest issue with a DVI connection – it would make no sense to limit themselves just for the heck of it.

  • Anonymous Bosch:

    This sounds like an HDCP issue, not a Samsung issue. I’ve run into plenty of instances where plugging a plain old HDMI source into a DVI-only display via a cheap adapter (that I bought without realizing it didn’t support HDCP) was no good.

  • Anonymous Bosch:

    On the other hand, if they were conforming to an industry standard (namely USB 3), they wouldn’t need to resort to hacks and wouldn’t be able to get away with having proprietary chargers, as you’ve written about before:

  • It’s not an HDCP issue. My DVI-HDMI monitor setup supports HDCP just fine, as my PS3 with Blu-ray discs proves quite definitively. Also, if it had been about HDCP, it wouldn’t have worked at all – it wouldn’t just have played at a lower resolution. The basic signal from the phone doesn’t require HDCP either.

    When connected, the screen mirroring is the same resolution as the screen, 480p in landscape mode. When you start video playback, a proper HDMI-HDMI connection then causes the video output to be up to 1080p, while one that goes via DVI continues to play at the mirroring resolution. Both my S II and 7.0 Plus act this way, and some googling shows other Samsung devices do too.

  • Not quite sure why you think that would solve this particular issue? The way it works today is that a lot of devices charge using 5V at anywhere from 500-2000mA. A device that can pull 2000mA off a charger still charges from e.g. a standard 500mA USB port, just more slowly. It also doesn’t overload anything or cause any sort of inherent incompatibility with “slower” chargers. Some companies’ decision to require a certain resistance between the data pins (which the adapter in the article you link to adds) before charging off a charger that doesn’t also provide an actual data connection has (as far as I know) no purpose other than to limit which standalone chargers work with a device. They’re already getting away with being difficult for no reason other than to make money on accessories, so I don’t see that changing anytime soon.

    It gets even more complicated when the EU agreement to move toward a common charging method gets involved. My S II happily charges off generic chargers that my Galaxy Tab won’t touch unless I use that adapter you linked to, because the S II is a phone and falls under that agreement. Apple is part of it too, which is why it offers a microUSB adapter in Europe. While I haven’t tried that adapter myself, I’m pretty sure it would have to disable Apple’s own version of the “add resistors to make it charge” scheme to do what the agreement demands. Of course, that agreement is also the reason why my Galaxy Tab has a sturdy 30-pin connector that I love, whereas my S II has a microUSB connector that looks like it would break if someone farted close to it. Essentially, an agreement that aims to make chargers more universal has left me with two mobile devices from the same company with different charging connectors, because one is a tablet and the other a phone. Go back a few years and other Samsung connectors start popping up on various MP3 players and the like. All the while, Apple is the one that gets bashed for using a proprietary connector – one it happens to use on everything and has used for so long that finding one is no issue at all.

    So bottom line, “industry standard” and “proprietary” are words that don’t always carry the positive and negative connotations people assume just from hearing them.

  • Anonymous Bosch:

    Current USB 3.0 allows up to 5 W, and this year’s revision will bump that up to two or even three digits. The former would allow the tablet to be charged at a decent clip if you aren’t using it while it’s being charged; if you are, it’d be no different from now, where using the new iPad while charging results in ridiculously long charging times. The latter figures would surely be enough.

    And the reason it would get rid of the proprietary charger problem is that the consortium has been getting more aggressive about companies using the USB trademark with this generation of USB. Stunts like Apple’s newer MacBooks outputting more than 2.5 watts, or ridiculous hacks that enable 10 W chargers labelled “USB”, wouldn’t be allowed.

    This would also get rid of your problem with having two different connectors, since micro USB 3.0 would provide just as much power.
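To put those wattage figures in perspective: charge time is roughly battery capacity divided by the net power actually going into the battery. A back-of-the-envelope sketch (my own illustration – the 42.5 Wh capacity is an assumed figure for the new iPad, and conversion losses and end-of-charge taper are ignored):

```python
def charge_hours(battery_wh: float, charger_w: float, draw_w: float = 0.0) -> float:
    """Rough charge time in hours: capacity / net power into the battery.
    Ignores conversion losses and the taper at the end of the charge."""
    net = charger_w - draw_w
    if net <= 0:
        raise ValueError("charger can't keep up with the device's draw")
    return battery_wh / net

battery = 42.5  # Wh, assumed figure for the new iPad
print(charge_hours(battery, 10))     # 10 W iPad charger        -> 4.25 hours
print(charge_hours(battery, 5))      # 5 W USB port, idle       -> 8.5 hours
print(charge_hours(battery, 5, 4))   # 5 W while drawing ~4 W   -> 42.5 hours
```

The last line is the “ridiculously long charging times” case: a heavy screen-on load eats most of the 5 W, leaving almost nothing for the battery.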

  • I see – I wasn’t aware that they were making the rules stricter. That’s great, and should surely help the situation. The question is when USB 3 becomes standard, though; neither it nor SDXC is really new anymore, but I still don’t see much support for either :/

  • I have the same issue with my Galaxy Tab 7.0 Plus. The issue of standards and a bunch of techno-babble does not matter to consumers who run at a level of knowledge below most of us who frequent sites like Pocketables. The point is that capabilities that appear to be available on a Samsung device do not match expectations. And then those consumers do not become repeat buyers – which is what Samsung needs if it expects its initiatives in the cloud and its media store to gain it anything. The lack of brand cachet and repeat buyers is why I question Samsung dumping money into proprietary ecosystem solutions. I do not expect to look at Samsung for my next tablet or smartphone following my experiences with the 7.0 Plus and the Nexus S 4G. My BlackBerry PlayBook outputs to monitors over an adapter just fine, and Samsung is in a much better financial position to have implemented its solution robustly than RIM was. Samsung needs better hooks to keep one-time buyers from looking elsewhere for their repeat buys. That is a matter of perception, right or wrong, and not something Samsung can turn around by standing on specs and standards.
    – Vr/Z..>>

  • I couldn’t agree more. Industry standards aside, the simple truth is that if you put a Samsung device next to one from Apple or some other manufacturer who doesn’t have this issue, you end up with one device that can do something the Samsung one can’t. Blaming it on industry standards is essentially the same as killing the idea of innovation on the basis that it isn’t standard. There may be a hundred good reasons why something is the way it is on a device, and a consumer might even understand that it is how it is, but at the end of the day, if another company doesn’t have the same issue, that company may just have gotten a new customer.

  • vakeros:

    I am not sure I am following the gist of Andreas’ complaint.
    Are you saying that the S II won’t output to HDMI at the correct resolution, or that the piece of kit you have doesn’t work correctly with what the S II broadcasts? And is what the S II broadcasts the standard?
    What I am getting at (not owning an S II, but considering an S3 or whatever it will be called – Olympic!?!) is: if you buy a different adapter, can you get the right result?
    I agree that Samsung could exceed the standard, as you suggest other companies do, but at the moment this just reads as an exercise in whingeing – how dare Samsung not support every configuration I choose to use, even if it isn’t one they support, because xyz do!
    “Because xyz do” can be a reason to buy xyz’s product instead of Samsung’s, but it shouldn’t be something to complain about, unless Samsung aren’t actually offering what they said they would.
    I hope you understand the differentiation I am making.

  • HDMI and DVI are essentially the same signal with a different connector, plus audio in the case of HDMI. As such, a physical adapter between the two is all you really need to turn one into the other, compared to e.g. VGA, which is an analog signal and requires an active converter rather than a passive adapter. Because of this, HDMI and DVI really are interchangeable in a lot of ways. Samsung’s mobile devices, both tablets and phones, are however not able to handle this, for reasons I’m not even quite sure of, but I think it has to do with how the monitor identifies itself to the device. The result is that when I start playing e.g. 1080p content on my S II, a DVI screen continues to display at the phone’s resolution, whereas an HDMI screen instead switches to 1080p. Samsung devices are the only ones I’ve seen do this, no matter if it’s a desktop PC with a dedicated control panel for video output or a $200 point-and-shoot camera that had HDMI out thrown in for the heck of it.

    I understand your basic point, but I don’t think it applies here. There are lots of “weird” types of video connections out there – for instance, it’s possible to get S-video through VGA and VGA through DVI as long as the transmitting device supports that kind of non-standard use of the connectors. I’m not sitting here complaining that Samsung didn’t add a secret VGA signal option to its HDMI output so that you could get analog video out of it with the right connectors. I’m complaining that the device can’t handle a simple physical plug adapter that is such an “of course it will work” kind of deal that Wikipedia doesn’t even mention the possibility that it won’t work (it mentions HDCP, which is a copy protection scheme and has nothing to do with this issue).

    Bottom line here is that Samsung has managed to add a layer of incompatibility to something that any other brand and the market itself treats as fully interchangeable (HDCP copy protection aside).

  • Hi Andreas,
    Your bottom line makes this clear for me. Presumably they haven’t actually added something to make it incompatible, but rather failed to add something that should make it compatible as a normal state of being. Is this true of the Note, and has it been dealt with in the ICS update?

  • The issue is still there with ICS, judging by a forum thread I found. I don’t know about the Note, but I assume it’s affected, as the issue is present on my Galaxy Tab 7.0 Plus – a device that is technically as new as or newer than the Note.

  • Brian K. White:

    I just discovered I have the same problem with my Sprint HTC EVO 4G LTE and a brand new AOC 27 monitor that has VGA and DVI inputs. I couldn’t find any way to force either the phone or the monitor to a resolution that I know both handle just fine. Both devices do full HD 1080p. The phone does it on my TV or anything else with an HDMI port. My laptop, USING ITS HDMI PORT, going through THE SAME HDMI-DVI ADAPTER on the monitor, does 1080p.

    But for some reason the phone just does 480p on the monitor. I couldn’t find any app or root tweak that forces it either.

