The next best thing to OLED is getting cheaper (arstechnica.com)
125 points by rbanffy on Feb 21, 2022 | 161 comments


Monitors are a depressing set of products. Every consumer has a slightly different set of priorities, and for me, the continuous cycle of compromises to arrive at a monitor that checks just over 50% of my requirements feels punitive. The lack of satisfaction is compounded further by the tour de force of reasonably priced and high-end display technologies packed into a sexy af piece of art hanging in my living room. Stepping from the living room into the office feels like a trek into the past.

I'm hopeful that QD-OLED will make that a less jarring transition. Mini-LEDs may help modernize the market, but I don't see them making a significant dent in my disappointment.


If you have the time, some basic electronics skills (mainly soldering), and maybe 3D printing/basic CAD design, you can make your own monitor, and at a decent price too.

Step 1: Go to panelook.com (or directly https://www.panelook.com/modelsearch.php?op=resolution if you have a resolution in mind)

Step 2: Find the model number of your desired monitor and put it into your desired search engine (Google works)

Step 3: Buy the panel WITH THE DRIVER* from AliExpress/Alibaba/TaoBao/eBay/wherever you get Google search results from

Step 4: Wait (???)

Step 5: Design a 3d model and print it, or use acrylic and cut it by hand, or whatever you want. Profit!

* - the display connects to the display driver. The driver is the one with an HDMI/DP/USB-C port.


This. 3 years ago I built a 1mm thick 4K monitor by just mounting a bare panel to a thin laptop stand and hiding the eDP to DP board underneath. It was better than anything on the market and cost literally 3/4 as much as a normal fat and ugly monitor.

I’ll just add “Message several sellers and ask for a datasheet and 1pc price+shipping from each one” to Step 3.


Let's see some pics


It doesn’t look as interesting as you’d expect. In fact it looks just like any other panel lol. These days I only use it to set up new machines in the data center: https://i.imgur.com/H7kww6U_d.webp?maxwidth=1520&fidelity=gr...

My current daily driver is the XB273K (27” 4K 120hz)


Cool, do you have a way to set contrast, color, brightness, temperature, etc. via an OSD menu, like in factory monitors?


Yep! Just about any driver board that can convert between different input formats will also have an OSD.

I’m kinda surprised people have so many questions about this. I dug up the emails. I paid $176 incl shipping/tax to zjtechhk for a B156ZAN03.4 panel, eDP cable, and an MST9U11Q1 driver board (which has DP, HDMI, and USB-C inputs, eDP output, and a ribbon cable to a PCB with 4 buttons to control the OSD)

It was an awesome deal at the time, but I’m pretty sure all these components are obsolete by now.


But does it have a terrible menu system, that takes 1 button press to change the color of gaming LEDs but 10 button presses to change the input? If not, it will never compete with commercial monitors.


Lol. The default keymap is indeed fairly intuitive since it hasn’t been customized by the marketing dept yet. Power, menu, up, down, input.


Lol uh, what's the don't touch label for?


It’s for factory workers that assemble laptops. Despite the label, it’s perfectly ok to touch it. There’s a flex PCB behind the label that drives the backlight and converts the eDP signal to pixels. It’s normally protected by the laptop bezel so it’s somewhat fragile. If you bend it, you’ll break traces on the PCB or pop components off.


Maybe I'm remembering old tech, but doesn't that also have thousands of volts in it?


You are! My first laptop ~15 years ago had a super thin (1mm?) fluorescent tube below the panel where the flex PCB would be. It used thousands of volts to start the backlight.

My panel (and anything from the last decade) uses LED backlights so I would not expect anything above 60V.


The problem is precisely the selection of appropriately sized "TV quality and price" panels to make a monitor with, not the lack of a way to physically build such panels into monitors.


I'm not sure I understand you. Are you saying that the problem is selecting the right model out of all the available choices, or are you saying that there aren't models available at TV quality/prices?

Regarding the former, the website does have quite a few filters (though the navigation isn't the best but it works).

For the latter, yes, quality can be quite hit-and-miss when purchasing, but prices are very reasonable. There may be a ~10-15% markup compared to a large-scale, razor-thin-margin monitor being sold at nearly a loss on Amazon, but most products are quite reasonably priced.

Re: the quality, if you're talking about advanced features, you may be able to contact the distributors/manufacturers selling them on the website and ask them to make a custom order with more features (e.g. an extra port) for a slight premium, though this varies.


The latter. As you say, finding panels that exist is easy even for a layman, so it follows that, out of the 100+ million monitors shipped per year, at least one manufacturer would be using said all-around vastly superior panel and you wouldn't have to DIY it.

I've used this method for repurposing laptop screens, but I've never found it a significant cost saver (even ignoring time/labor) vs standard prebuilt monitors, nor is anyone making such a panel exclusively for DIY-ers.


It's certainly true that for high-end displays I'd just go to Dell/LG/Samsung etc instead of AliExpress. Heck, anything that's premium enough to cover beyond sRGB is likely to be rare on Ali. But (imo) the low price and easy access to the low-to-mid end of the product range is the biggest benefit.

Not saying that you can't get those 8k panels LG uses - but - sending 4k USD to someone over AliExpress feels quite risky even to me. You could probably get good stuff if you put in the effort, but if you're earning decent money from your CS job (I suppose most here are, I'm still a student) then it'll probably be easier to buy OEM/brand name.


here's a video of a youtuber showing how to do it, in case someone wants a visual example. https://www.youtube.com/watch?v=DrqdHVeBkp4


They're usually called scaler boards, not drivers. Ones with LVDS outputs are common, but a lot of the new panels use VbyOne which seems to be rarer and more expensive to find a scaler for.


Thanks for sharing panelook; just spent 30 minutes browsing around. I love niche sites like this!


Thanks, I'm happy when I can share bits of trivia like this ^^


I've been going crazy and frustrated with the lack of HiDPI monitors on the market which can do integer scaling (27 inch 5K, 32 inch 8K) and here's a website that lists panels for such monitors.

I have no idea how to go about making a custom monitor but if it's feasible to solve my problem, I'm willing to learn.

So let's say I want to buy the LM270QQ1-SDA2, should I order that panel from an e-commerce website? Or should I find a monitor with that specific panel? Why is 3d printing needed here? Any article or detailed guide you can point me to?


I’d recommend buying the panel, eDP cable, and driver board from the same seller. I posted my experience doing the same above. Of course you could take a chance with random components and it’ll probably work as long as the connectors match, but why bother when Chinese vendors are willing to do the legwork for you?

You don’t need a 3D printer if you have another way to hold the panel (I glued it to a thin laptop stand)


Got it, so I should get the panel, the driver, and an eDP cable (to connect the driver and the panel?) from a single seller.

I assume that the use of a 3D printer was mentioned for building an enclosure for the panel, like for using a VESA mount?


Yes exactly. You need something to attach the panel to (unless you've got magic to float it in mid air lol). You can repurpose an old canvas/acrylic board, or go fancy with a custom 3d printed case.

Thing is, you could glue the panel to something, as people have done. I find that detestable because you can't remove the panel non-destructively. A proper enclosure can hold the panel instead of sticking to it.


Sounds like it could be a good business. Custom monitors, no Alexa or spyware. Choose the specs you care about.


Quite a sign of the times when "no spyware" is now a valuable and rare differentiator.


What are your priorities? A lot of pro photographers I know use iMacs because they are a cheap way to get an adequate computer attached to a monitor with high resolution and OK (for the price) color space


This is currently challenging because the 5K iMac is still on Intel which I’d be hesitant to buy into at this point. Hopefully resolved soon but we don’t know exactly when or what form it will take (e.g. will the 5K iMac be an “iMac Pro” and have a higher price than the Intel version?).


System agnostic (not an all-in-one), <35", HDR, OLED or OLED-backlight, >144Hz, 1440p or higher, G-Sync, 16:9, >=99% DCI-P3, doesn't look like an F-117 fucked a ferris wheel at an EDM festival, <$2k.

Prioritized in that order.


I'd be curious as to the differing priorities you are thinking about.

For me, I want real estate. At this point I'm looking forward to 8k in a 50-55" TV, good DPI, and 60Hz. Not a twitch gamer, I'm a developer. I use TVs.

Gamers want response time/Hz, decent appearance. They are the prime target of "Monitors"

Then there's the professional editors and the like. They used to need high-end monitors, but I think high-end 8k TVs will serve them as well.

What else is there?


What I am looking for is a large (at least 37.5") ultrawide OLED (or something with comparable contrast) with at least 3840x1600 pixels and a >= 120 Hz refresh rate. Basically something that is a straight upgrade from my current monitor [0] that improves the contrast without compromising on other factors (size and resolution are hard requirements, refresh rate, pixel response and color gamut are negotiable as long as they are good enough, brightness I don't care - have the current one set to 10%). Oh, it also needs to support FreeSync, but that seems to be less of a problem these days.

There are no panels that fulfil that at the moment. The nearest option would be getting a 4K OLED and then just not using part of the panel, but that is hardly ideal.

[0] https://www.lg.com/us/monitors/lg-38gn950-b-gaming-monitor


Sure you’re not a gamer, but have you tried using a 120hz/240hz monitor for a week? I think you’ll find that it’s almost as massive an upgrade as going from 1080p to 4K.

Going back to 60hz is painful. You’ll see the cursor moving in a rotating square pattern when you’re moving your mouse in a circle. The lag is palpable.

> What else is there?

Well HDR is nice to have.


I've been using a 144Hz screen for two years, I am a gamer, and I'll be honest: I don't notice much difference between consistent 60Hz and 144Hz. And I'm fairly sensitive to frame rate - when I'm watching a movie at someone else's house, I can tell in seconds they have the garbage "smoothing" or whatever features enabled.

Things that do stand out to me are input lag (again, usually only from TV-as-monitor with bad settings), bad colour space, and occasionally bad grey-to-grey time. I would take a 60Hz monitor over a 144Hz one if it meant avoiding any one of these issues. In a heartbeat.


One of the jarring things about using a TV as a monitor is that some TV models don't have the option of disabling the image processing that blurs the text, so you end up with blurry text that gives you headaches. I remember encountering this problem in the past with a specific Samsung TV model where the processing was only disabled for the VGA port but not the rest of the ports (it was hard-coded and couldn't be disabled).


Not the GP, but my current dream monitor would be a 3:2 or 16:10 OLED in the 24"-27" range with roughly 200 PPI and 120 Hz, preferably slightly curved, with hardware calibration for at least sRGB gamut. There’s nothing close to that in the market.


sRGB is a very modest goal.

I just got an RGB OLED laptop with a gamut significantly wider than Display P3. It's just glorious. UHD content like 4K movies just pop in a way that you have to see in person. It's especially noticeable on military uniforms, where the various shades of dark green are much more distinct than on a typical monitor.


My priority is color accuracy, via hardware calibration (LUT) (no loss of gradations by OS-level or GPU-level mappings). I’d rather have an accurate sRGB display than a not-quite-accurate P3 (or, worse, "natural" wide gamut) display. Also, to display sRGB images (still the large majority of what’s out there) accurately on a wide-gamut system, you need 10-bit color depth at the OS/GPU level to not lose/distort gradations.


Most wide gamut displays are 10-bit per channel, which makes them accurate enough even with software calibration.

Most also have 14-bit LUTs in hardware.


It's not sufficient for the display to be 10-bit, the OS and/or GPU (where the software calibration mapping takes place) must also work with 10 bits, and when graphics from different color spaces are combined on screen (UI graphics, images displayed, etc.), the OS must correctly map the source color space to the 10-bit output color space. All of that working correctly is not common-place yet.

Therefore, for dev work and dev-related UI graphics, I prefer to work in a calibrated "least common denominator" 8-bit sRGB space, because that's much easier to get right. However, in order to not lose color gradations to calibration, hardware calibration is then preferable.
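
To make the gradation point concrete, here's a toy sketch (the 0.92 gain is a made-up correction factor, not from any real profile): applying even a simple calibration curve inside an 8-bit pipeline collapses distinct input levels, while a 10-bit output path preserves them.

  # hypothetical per-channel correction from a software calibration profile
  gain = 0.92

  # 8-bit in, 8-bit out: some of the 256 input levels collapse onto the same
  # output level, i.e. gradations are lost (visible as banding)
  out8 = {round(v * gain) for v in range(256)}
  print(len(out8))   # 236 distinct levels remain out of 256

  # 8-bit in, 10-bit out: every input level stays distinct
  out10 = {round(v * gain * 1023 / 255) for v in range(256)}
  print(len(out10))  # 256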


It is common-place-ish.

Windows since Vista can use 16-bit float buffers for the desktop manager. Some applications support this too for all controls and UI elements. Desktop graphics applications support 10-bit, such as Photoshop. Similarly, video playback is generally 10-bit.

In the past, this feature was reserved for the ludicrously expensive "professional" GPUs like the Quadro series, but it has been enabled in software for the mainstream AMD and NVIDIA GPUs. Very recently (just months ago?) my Intel GPU gained 10-bit output capability even in SDR mode.

It definitely works, I used "test pattern" videos and test images in Photoshop, and even dark grey-to-grey gradients are silky smooth on two of my monitors. This includes a 7-year-old Dell monitor!


What are your requirements? I had no issue finding one that met all of mine at a reasonable price a few years ago. 144Hz, low latency, decent size, 2k resolution.


The solution is simple: use the “sexy af” piece of art in your office.

I’ve used one for 4 years. The only compromise is I have to power it on/off by remote.


I've considered it, but can't play FPS effectively on anything over 34" due to biology and the $ per sqft of real estate. The seating position to keep the game in focus would be halfway across the room.

OLEDs under 40" aren't TVs, and almost all OLED monitors target revenue-generating use cases and are priced to match.


LG sells several 48" OLED TVs. To get the equivalent of 30" distance to a 27" monitor (about 48 degrees FOV), you need to be about 4.5 feet away from a 48" screen.

I just mounted mine to the wall, and the back of the desk is about a foot off the wall.
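
Rough sketch of the geometry, if anyone wants to check the numbers themselves (the exact FOV figure depends on which formula you use, but by any of them the two setups come out essentially identical):

  import math

  def hfov_deg(diag_in, distance_in, aspect_w=16, aspect_h=9):
      """Horizontal field of view (degrees) of a flat screen at a given distance."""
      width = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
      return math.degrees(2 * math.atan(width / 2 / distance_in))

  print(hfov_deg(27, 30))  # ~43 degrees: 27" 16:9 panel viewed from 30"
  print(hfov_deg(48, 53))  # ~43 degrees: 48" panel viewed from ~4.5 feet

  # equivalent distance just scales with the diagonal: 30" * 48/27 ≈ 53"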


LG is also introducing a 42” OLED this year - which I am very tempted by [0].

[0] - https://tftcentral.co.uk/news/lg-c2-oled-tv-line-up-for-2022...


I have the Gigabyte FO48U, which uses the same panel as the 48" LG C1, and... it's a beautiful TV. That's what I wound up using it for, because I couldn't stand it as a monitor and went back to my old 27". Image retention is still a real factor for computer use, and maximizing a mostly-white window results in this incredibly jarring transition from a screen that's almost uncomfortably bright to one that's too dim.


Playing FPS is office use? Are you a game developer? :-)


What brand/model are you using?


I have a 49" curved Samsung 8500. It isn't the newest technology anymore.

When a curved 42"-50" 16:9 OLED comes out, I'll be the first in line.


I really enjoy the 49" ultra-wide curved form factor Samsung has pioneered and is on the second generation of.


Mini LED has a marketing issue in my opinion. Afaict it is a rebranding of FALD. Rumors several years ago made it seem like "mini LED" would be LCD with per-pixel backlighting. "Mini LED" turned out to be a nothingburger. We have nearly the same number of dimming zones as we did 5 years ago. It's woefully insufficient.

The real story in monitors this week was the alienware QD-OLED 32" ultrawide (ugh) curved (ugh) for only $1300.


Eh, the only "ugh" thing about that new Alienware monitor, to me, is the low resolution - 1440p at 27"/34"? No thank you, 2160p or better + fractional scaling just looks so much better for those of us editing text and/or code all day.


It's probably more geared towards the gaming market then. 1440p/27 inch is a sweet spot for gaming because you can actually drive that resolution at a decently high fps. Even the latest (if you can get your hands on them) gfx cards still struggle to hit upwards of 100fps at 2160p, whereas you can hit that in most games at 1440p even on several-years-old hardware. For 34 inch I think 1440p will start to look a bit cramped, but at least it's not the usual 2560x1440, it will be 3440x1440.


Just to clarify, 34” 21:9 is the same height (and pixel density) as 27” 16:9 - and it’s definitely just on the cusp of noticeably not-pixel-dense when editing text, but I can see how it would be ideal for PC gaming where you sit a bit further back.
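
Quick sanity check of that claim with back-of-the-envelope arithmetic (my numbers, same formula applied to both panels):

  import math

  def panel_dims(diag_in, px_w, px_h):
      """Return (width_in, height_in, ppi) of a flat panel from its diagonal and resolution."""
      ppi = math.hypot(px_w, px_h) / diag_in
      return px_w / ppi, px_h / ppi, ppi

  print(panel_dims(27, 2560, 1440))  # ~23.5" x 13.2" at ~109 PPI
  print(panel_dims(34, 3440, 1440))  # ~31.4" x 13.1" at ~110 PPI -- same height, same density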


this right here. I went from a 1080p 24” @144Hz to a 1440p 27” @160Hz and it’s night and day. It was in that perfect sweet spot for size too, not just performance and quality. The bigger you go the further you have to sit from the screen, but 27” is just right.


200% scaling might be fine, but fractional scaling still isn't well supported on Linux. I'd prefer such a monitor, yes, if it wasn't going to limit what systems I can run.


That's totally fair! I'm still waiting on Linux to get its act together with fractional scaling, but until then, Mac works fantastic for my work with its fancy brand of fractional scaling w/ supersampling and better font rendering.


Hint hint, monitor manufacturers. Maybe your target market might become larger through some free software contributions.


Linux desktop has <2% market share. A monitor manufacturer would have to be suicidal to try fixing a platform that doesn’t even have proper support for basic features like display scaling.


1% could be the difference between breakeven and loss. I am a Linux user with 2 "low" resolution monitors, because I don't trust software support for higher resolutions, but admittedly also because 1080p is fine for me.


That’s fair, but with consumer electronics you need to sell at 2.5x BoM cost. Breakeven is death.

It’s worth mentioning that display scaling was one of many dealbreakers that made desktop Linux unusable for me: https://news.ycombinator.com/item?id=28490753

I’m pessimistic about any of this getting fixed in the next few years.


1440p at 27" still seems like the sweet spot to me. Image looks crisp, I don't have to turn on any janky DPI scaling features, and games actually run well.

That Alienware monitor is clearly targeted for gaming. Regular 4k is already difficult enough to achieve in AAA games, expanding that to an ultrawide aspect ratio would only make getting acceptable performance harder.


> Rumors several years ago made it seem like "mini LED" would be LCD with per-pixel backlighting

That has since become “micro LED”. Samsung makes a micro LED TV, but it costs six figures, and is only available in something like 80” and above.

When it’s finally affordable it will almost certainly unseat OLED as quality king though. All the upsides with no burn in or peak brightness issues.


Seems like the transition is from OLED to QD-OLED to micro LED. Mini-LED seems to really have no place to live here, as far as I can tell.


Current OLEDs could possibly burn in within a decade. I won't buy one for a family TV that doesn't get replaced frequently; Mini LED is the suitable offering there. Micro LED TVs are a future product for the home, maybe over a decade away from shipping.


It looks like QD-OLED with a heatsink is actually better than MicroLED in almost every way. The MicroLED hype "forgot" to mention that it consumes kilowatts of power.


Micro LED is completely different from an LED LCD TV. It uses a tremendous number of RGB LEDs to make up the display, without any LCD. Mini LED is improved FALD LCD.


The "per-pixel" one is now called micro led


>Rumors several years ago made it seem like "mini LED" would be LCD with per-pixel backlighting.

It never did. Not on any reputable rumours site, and as far as I am aware not even on non-reputable ones like wccfTech.

>Afaict it is a rebranding of FALD. We have nearly the same number of dimming zones as we did 5 years ago.

You will need to find me a 27" monitor with 500 local dimming zones from 5 years ago. I can assure you there is none.


> made it seem like "mini LED" would be LCD with per-pixel backlighting.

I thought that was microled? Which does all that and seems like a strict upgrade over oled. However, there are currently just about 4 models on the market and starting prices are at 100k.


For me that's the perfect gaming monitor. Ultrawide is just chef's kiss, and the resolution is high enough to be sharp but not so high that you need a 3090 to run it.


Mini LED is FALD, but the LED parts are physically smaller. Roku's current FALD TVs top out at 80 zones. This monitor has over 500.


I did a ton of research a couple of months ago to find a good 4k monitor mostly for programming (a 4k monitor has been seriously worth it!), and occasional gaming.

I ended up with the LG 32UL500-W (https://www.rtings.com/monitor/reviews/lg/32ul500-w), for about $300 US.

My conclusion was that trying to get a higher refresh rate 4k monitor, or even higher resolution, or OLED or anything else, made it 2-6x more expensive. Prices for these features still have a _ways_ to go before they're affordable.


I've ended up with this one - https://www.rtings.com/monitor/reviews/benq/ew3270u

Which reviews almost identically apparently. Great for dev work, fine for games. There still doesn't seem to be anything massively better than these two in a 32" 4k screen, without spending multiples of the price


You don't find the lack of brightness to be an issue? Both of these monitors only hit 280 nits.

I do find that the intersection of size, resolution, and brightness is nearly empty. I have a pair of U2720Qs, but would love something more in the 500-600 nits range or even 1000 that isn't the Apple XDR. LG has promised a new 40wp95c for over a year now (and amusingly reannounced it at CES this year, since they didn't ship at all in 2021) which maybe works but at $2000 for a monitor, it's not for everyone (and it came in at a much lower brightness than people expected).


Not really, it seems bright enough to me. I have bright Australian sunshine coming in the window of my office and I can still see it absolutely fine (the sun does not hit the screen directly, I will add).

Doesn't blasting more nits at your eyeballs cause more strain? I'm asking this out of ignorance, I have no real idea, a display that's too dim seems likely to cause this as well.

I can definitely see more brightness being great for gaming and media consumption/creation. But for coding? Not sure I'm there, and I'm certainly not looking to spend $2k on a screen, when that can buy a pretty good, much larger TV!


Lack of brightness? The monitor is pretty much _too_ bright as it is. I use around 30-40% brightness and never feel the need to adjust it. An even brighter monitor seems alien to me!


You’re OK with VA instead of IPS?


Could someone explain to me why we have OLED smartphones (which are awesome) and OLED TVs (which are awesome), but no OLED monitors?


OLED monitors do exist, just at a very much unreasonable price and size:

- https://www.lg.com/us/monitors/lg-32ep950-b-oled-monitor

- https://smile.amazon.com/dp/B07XTZ45T2 (this one is more of a laptop screen than a monitor)

I’m simplifying here, but essentially no one has figured out how to make OLED panels affordable at a reasonable form factor for monitors yet. The type of panels used in phones (and tablets/laptops) are very different from those used in TVs. LG is making inroads with their 42" C2s, but those will also have a high price and are still way too big for normal monitor use cases.


There's a 34" ultrawide Alienware QD-OLED (Samsung's new OLED tech for TV panels) coming this year for $1300, which is still expensive but nowhere near as expensive as monitor OLEDs have been.


I seem to recall smartphone OLED screens being made by one company and tv sized OLED screens being made by another, with the smartphone OLEDs being better with burn-in but very difficult to scale up


Yeah, Samsung makes most of the smartphone OLEDs and LG makes the TV OLEDs.


There are no high-PPI TV OLEDs that fit the monitor market. TV OLED panels (or WOLED, by LG) only go down to ~40" TV size with 4K resolution, meaning you'd get sub-2K resolution at 20-30" sizes.

Smartphone OLEDs from Samsung, BOE, or other makers are too expensive to scale up to computer monitor size.

And burn-in. OLED just isn't well suited for computer monitor usage (apart from QD-OLED).


Wait, 40" @ 4k is exactly what I want. Where can I get one?


My understanding: They still can't get big enough sheets without defects. The bigger the end product, the more failed sheets have to be thrown out or cut down for smaller displays, and the price rises rapidly because double the diagonal means 4x the pixels.


They can make sheets large enough for TVs, and both TVs and phones come with high enough pixel counts for monitors. You don't need phone pixel densities on a monitor, but TV pixel densities might be too low.


Lots of people game on the LG C1. 55in is $1300ish.


Will this be using PWM like almost all OLED screens for dimming? I feel that the answer is probably yes and it's quite frustrating.

But my wants are probably a bit different than everyone else's: ideally a touch (w/ pen input), high resolution (integer scaling), matte monitor that uses DC dimming.

I've lost hope on such a thing existing (closest thing I have seen is the latest thinkpad yoga x1, gen6 I think, I'd be into buying just the screen, but not really possible). Instead I'm hoping for better e-ink panels which will be even better by eliminating backlight completely.


What is the matter with PWM dimming? Would it be resolved by using a higher frequency for PWM? The main reason you use PWM for dimming LEDs is because DC dimming (or filtered PWM) causes color shift. By using PWM, you can drive the LEDs with the same current, but effectively switching them on and off. Driving LEDs at different currents produce slightly different color outputs.
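
To put rough numbers on those on/off gaps (240 Hz is just an example frequency here, not a claim about any particular panel):

  def off_time_ms(pwm_hz, brightness):
      """Milliseconds per PWM cycle the emitter is fully off, assuming simple
      on/off dimming with brightness given as a 0-1 duty cycle."""
      period_ms = 1000.0 / pwm_hz
      return period_ms * (1.0 - brightness)

  print(off_time_ms(240, 0.2))   # ~3.3 ms of darkness in every ~4.2 ms cycle
  print(off_time_ms(2400, 0.2))  # ~0.33 ms -- a higher frequency shrinks the gaps

If those dark intervals are what bothers people, then a high enough PWM frequency should indeed make them negligible.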


Do you get touch (with or without digitiser) and matte? I don’t recall ever having seen the combination outside of e-paper devices like reMarkable, though it certainly could exist and I just haven’t seen it. I know there are after-market solutions for things like iPads where you stick an extra sheet on top, screen protector style. I have no idea how pleasing they are.


Look at thinkpad x1 yoga gen 6 for matte with touch & pen input. And other non yoga thinkpads have had matte with a touchscreen.


A PWM with a capacitor and a load is an adjustable DC supply. What problems are you having with it?


I may be an edge case, but after getting a concussion, all my screens need to be higher refresh rate. 144hz works; my phone is 90Hz and it's borderline. But more importantly they all use DC dimming. Had to sell a laptop with PWM dimming.

I suspect others are just sensitive to it (sans concussion) and prefer DC dimming.


PWM+capacitor is not an adjustable DC supply. You also need an inductor to do that: https://electronics.stackexchange.com/a/31714/15654

Of course the wires have resistance so it would technically “work” as an RC filtered PWM until it overheats.

PWM dimming means turning the LEDs on and off without any smoothing.


>Will this be using PWM like almost all OLED screens for dimming?

But the article is about Mini LED LCD?


Yeah sorry. Most LCDs, at least up to a couple of years ago, use PWM for dimming, and all OLEDs do AFAIK.

I don't know enough to say whether mini LEDs dimming in different sections implies PWM, but I suppose they might, since it would probably be easier to change the duty cycle of the rapid on-and-offs of some LEDs than to change the current in order to dim part of the screen.


I hate the way the term LED was usurped by monitor makers. I would expect an LED monitor to have one LED per pixel, not use a single LED to backlight a whole bunch of normal LCD pixels.


Self-emissive LED is called microLED.


Well, it was to distinguish them from the old power-hungry, thicker (and often buzzing) CCFL backlights.


They could have invented a better term. It's almost like they wanted to confuse people.


Weird article. Burn-in is never mentioned. I could buy an OLED display if burn-in were no longer a concern for PC monitors. I wish it would last at least 5 years without special care; I'd buy even if the price were high. I just don't want to worry so much about burn-in and throw it away every two years.

For a TV, I don't replace it frequently, so I hope it lasts a decade.


At least one of the recent OLED monitors (I forgot which one, maybe the LG 32") has a physical resolution of 4K plus an extra width/height of 16 (?) pixels, and uses slow pixel shift to slightly move the image around to prevent/reduce burn-in. That strikes me as a good approach.


Looks great. PC monitors should be made like this.


>Burn in is never mentioned.

This is about Mini LED LCD Display, why would burn-in be mentioned? Or you want it to be stated as an advantage?


It's definitely a pro for Mini LED displays compared to OLED. I'd expect it to be mentioned when comparing against OLED.


I had a hard time with that article because it kept swinging between two very different display technologies. OLED and Mini LED are not related and the monitor they are talking about is Mini LED, so why even talk about OLED?


Probably because both displays are associated with deep blacks. Mini LED has halo issues but it’s the obvious comparison.


Mini LED, but it's rather hamstrung by only being 2560×1440.


This. I've been using multiple 1440p displays since 2010, and actually had a 2560x1600 LED back in 2004.

I'd do anything for 5K, 120Hz, Mini-LED or Micro-LED displays with 10-bit color and HDR. I don't want 4K: 5K has over 77% more logical area.

They don't have to be cheap. I'll pay well. I just want them to exist. I've been waiting over a decade at this point!


my favourite thing about 5K is that you can do 2x scaling and still have plenty of space on your screen to display multiple windows (effectively 2560x1440)

but with a 4K screen, you have three options: 1. no scaling and tiny fonts (ugh), 2. fractional scaling (ugh), or 3. 2x scaling and settle for 1080p real estate (ugh)

IMO 5K > 1440p > 4K > 1080p
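
to spell out the arithmetic behind those options (plain integer division, nothing platform-specific):

  def logical_res(px_w, px_h, scale):
      """Logical desktop resolution at an integer HiDPI scale factor."""
      return px_w // scale, px_h // scale

  print(logical_res(5120, 2880, 2))  # (2560, 1440) -- 5K at 2x: crisp and roomy
  print(logical_res(3840, 2160, 2))  # (1920, 1080) -- 4K at 2x: crisp but cramped
  print(logical_res(3840, 2160, 1))  # (3840, 2160) -- 4K at 1x: roomy but tiny fonts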


Can someone explain why scaling a 4K screen by 200% is better than a monitor with 1080p native resolution? Or in general why is it better to use a scaled-up high res screen instead of a natively lower res screen? Doesn't the scaling get rid of any gains of being higher res? I just can't find an answer to this simple question online.


Assuming your entire software stack supports it, the advantage is that stuff gets displayed with twice the resolution but at the same physical size. Your operating system probably already does this with fonts and vector graphics. It can also be done with raster images too if they are provided at 2x resolution.


With 2x scaling, fonts are really crisp. Regular 1080p gets there most of the way, but with 2x scaling on a super high res display, the difference is noticeable and it's hard to go back.

Have you ever switched from a high refresh rate screens to a 60 Hz screen and noticed a difference? I find this to be similar.


I've been waiting for basically the same thing in 3:2 or 16:10, although I'd settle for 4K, given real-world constraints. For what it's worth, that's a tremendous amount of bandwidth that even _new_ protocols such as HDMI 2.1 or DisplayPort 2.0 don't support, except DP 2.0 in its maximum bandwidth mode, which hasn't been implemented anywhere as far as I know. If you take your requirements down to 4K, then both HDMI 2.1 and DisplayPort 2.0 at UHBR10 can do it.


There’s the LG Ultrafine 5k monitor, but I suppose it doesn’t have HDR or 10-bit color?


Neither. It came out in 2014, with some revisions since. 8 years later, there's just not much else out there in 5K-land.

The products basically don't exist.

I can get some rather fantastic 1440p displays -- HDR, 165Hz, DCI-P3 color gamut, and excellent color reproduction (especially when calibrated) for as cheap as $260 on sale. Best thing imaginable? No, but amazing for the price, and objectively better in every measure than the 1440p displays that were $1000+ in 2010.

I'd pay $2000 -- perhaps even $2500 -- each (and I'd buy THREE), for a 5K display that met my requirements.

-----

Hardware Unboxed recently made a video complaining that the display market is too heavily driven by gamers wanting a specific, narrow feature set for a rock bottom price.

Display makers have delivered that. However they don't really make much of anything else as a result.

It's a nasty, difficult, low-margin business where almost no one is making money, hence why it's huge volumes of largely the same thing with no major shifts forward.


Seems like there have been advances, just not in pixel density beyond 160 PPI.

The bigger issue than low margins is the "4k" marketing and the lack of large buyers other than Apple for 5k displays. I think early on in the 4k hype train, there was a bandwidth issue with DisplayPort that required MST for 5k.


It’s fantastic, though. And it’s hard to live with less than 500 nits once you’ve used it. 5k is where it’s at.


So basically the Apple Pro Monitor but with 120Hz?

Unfortunately, 200+ PPI panels are extremely expensive. So 5K panels at 27"+ are, practically speaking, Apple-only SKUs.


> I don't want 4K: 5K has over 77% more logical area.

Can you elaborate?

5/4 = 1.25


(5120 * 2880) / (3840 * 2160) = 1.77

Since they are both the same aspect ratio (16:9) you can also take the ratio of one linear dimension and square it to get the area ratio.

So:

(5120/3840)^2 = (2880/2160)^2 = 1.777


https://en.wikipedia.org/wiki/5K_resolution

4k is 4096 x 2160, 5k is 5120 x 2880

There's also a really helpful diagram on that page


It's cinematic 4k. 4k monitor is 3840 x 2160.


5:4 is only one dimension, but 5k screens have greater resolution in both dimensions. I make it roughly 66% more area:

(5120 × 2880) / (4096 × 2160) = 1.6666


It comes out to 77% if you use the SMPTE UHDTV standard of 4k, which is (2 * 1920) × (2 * 1080) = 3840 × 2160


Ah. I just checked, and that is indeed the resolution of the 4k monitor I'm currently using. So I would guess that standard is pretty common!


That’s also why 5K monitors are more than 2X the price of 4K ones at similar sizes. I have a 4K with those dimensions that I bought for $400, an LG 5K would cost at least $1200.


I think the price is more about where on the adoption curve those technologies are. 4k is pretty mainstream these days, older tech and produced in high volumes: ergo cheaper. 5k is still pretty new, mainly constrained to higher end products, and sales volumes are lower: ergo more expensive.


5K 27" monitors predate 4K 28" ones, but the supply was basically monopolized by Apple from day one. I can get an affordable 5K monitor, but unfortunately I have to get an iMac CPU unit to go with it.


And they took away target mode so your only compute option is the processor that came with it. That still annoys me, I’d like to drive my perfectly great 5K display with an M1 Mac mini.


For macOS users like me, 4K on 27" is weird. Running it at 1x dpi would make everything too small, running it at 2x "retina" dpi would make everything too huge. And macOS can't do non-integer UI scaling nicely. Apple uses the term "5K" when describing its iMacs with 27" retina-1440p displays, but monitors with that resolution are very rare it seems.


It's not perfect, but it's far better than 1440p on 27" - which is either super tiny or super blurry for my eyes.

The way I'm running 4k 27" on Mac is setting scaling to 2x ("looks like 1080p") - which gives the crispest UI - and setting scaling appropriately in a small set of applications so that they are not too huge. Just running the browser at 90% default scaling makes things pretty decent. And then maybe reducing the font size by 1pt on one or two text editors.


Yeah, Linux has similar issues with non-integer scaling. I'm sticking with 1440p for now as 4k at 1x for a desktop monitor is too small and 4k at 2x is sacrificing too much screen space, so I just do 1440p at 1x. 5k at 2x would be similar but with the advantages in text/etc sharpness.


>Yeah, Linux has similar issues with non-integer scaling.

Wayland does not have those issues and is very nice in my experience although I hear that screen sharing is still a work in progress.

I mean apps that talk directly to the Wayland protocol without XWayland in between, which includes Firefox, Chrome, graphical Emacs, all the Gnome apps.


Wayland doesn't solve fractional scaling. It doesn't support fractional scaling and instead renders surfaces at 2x and scales down, resulting in a fuzzy image with higher resource consumption. It's better than nothing, especially compared to the extremely backwards situation of XWayland apps, but it is very far from where it should be [1].

Even if Wayland were to support fractional scaling: Gtk doesn't support fractional scaling at the toolkit level - integer scaling only [2]. Although it's worth noting Qt and Electron could probably support fractional scaling today. The general philosophy being pushed on the Linux desktop is to do scaling like macOS, but unfortunately it combined that bad philosophy with a worse implementation.

Wayland is a blessing for the Linux desktop, but sadly the free desktop is still waiting to be dragged into the 21st century in many respects.

[1]: https://www.reddit.com/r/kde/comments/lficfe/wayland_fractio...

[2]: https://gitlab.freedesktop.org/wayland/wayland-protocols/-/i...


>Gtk doesn't support fractional scaling at the toolkit level - integer scaling only [2].

Your reference [2] is on the topic of Wayland, not GTK. Although the referenced page does contain 2 occurrences of "GTK", after reading the context of those occurrences, I am unable to find or infer anything in support of your assertion that GTK doesn't support fractional scaling.

Also, your reference [2] implies that there are two scaling factors. Specifically at the top of your ref [2] is a link to another page (issue #34), in which a participant, "Pekka Paalanen, Maintainer", writes,

>Fractional scaling is a whole another topic apart from UI scale. I want to mention it to avoid confusing the two. . . . This is different from UI scale, because it is effectively a fractional buffer scale, without changing the UI size.

The title of your ref [2] is "Add support for fractional scaling", which was written 9 months ago, at which time I was already using fractional scaling (what I mean by fractional scaling and the meaning that is relevant in the current thread of conversation) in released software (namely Gnome 40 on Fedora 34) -- which is another sign that you've incorporated information that is correct in one specific context, but you've erroneously applied it to a different context.

Skipping forward in your ref [2] by 3 sentences, we read, "The current Wayland protocol allows clients to scale their buffers by an integer scale factor" (emphasis mine).

But most of the elements -- particularly 99.9% of the text -- on a modern operating system are stored on the computer as mathematical descriptions of curves. The big exceptions to this are JPEG and PNG files (hence the interest in SVG). (Another exception is "bitmap" fonts, but IIUC the only people looking at bitmap fonts these days are techies who have specially configured their OS or their apps to use bitmap fonts.) These mathematical descriptions have no native resolution: they can be rendered cleanly at any scaling factor.

I use Gnome Settings to tell my computer how large I want things to be, then the text is rendered at that particular scaling factor. There is never any moment in which any framebuffer or rectangular region intended to be displayed on the screen is scaled (again excepting JPGs and PNGs). I know this because I have not been using a HiDPI monitor, so I can see the individual pixels in my monitor. There is no way for the text on my computer to look as good as it does if it were being rendered into a buffer, followed by that buffer's being scaled. I would be able to tell the difference. Unless you want to try to tell me that changing the text size in Terminal.app on a Mac to some size other than the default size causes text to be rendered into a buffer, then the buffer gets scaled before being shown to the user. Or when in Google Chrome, I use the "hamburger menu" to set the "Zoom" of a web site to something other than 100%, that web page is being rendered, then scaled.

Note that Google Chrome and Firefox do their own text rendering: they do not rely on the host OS to do it like most apps do. Consequently, exploring the Zoom setting in Chrome is a good way to experience what scaling on a pure-GTK3 Linux install is like, only that the scaling factor on Linux applies to everything on the screen (including the mouse cursor) whereas in Chrome it only applies to the viewport (minus the mouse cursor).

Again, IMO the people in your reference [2] are talking about a technical detail different from the detail you think they are. Maybe that detail is relevant when you have 2 monitors of different native resolutions and you drag a window from one monitor to another. But that is not the topic of conversation here. Here we are talking about users with a single monitor who, because they didn't do enough research before buying the monitor, are faced with a choice of the text's being too big or too small. That can happen on MacOS, but will never happen on a pure-GTK3 Linux install. Pick whatever native resolution, monitor size or pixel density you want: the Linux install can be adjusted so that things are not too big and not too small. (And it takes only 2 seconds to switch sizes: any windows that are open when you switch automatically adapt.) Specifically, on a monitor with 1680 horizontal pixels, Gnome Settings is giving me the following choices for the scaling factor: 100%, 125%, 150%, 175%, 200%. My guess is that if my monitor had more horizontal pixels, I would get more choices.

System Preferences on a Mac (specifically the Display ("Displays"?) pane) also gives you some choices that make everything bigger or smaller, but if you don't choose the native resolution of the display, everything is very blurry -- which does not happen on a pure-GTK3 Linux install.

ADDED: You have to run the following command (once per install) to activate the "Scale" line in the Display pane of Gnome Settings:

  gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"


Well, it's all fine if it works for you, but there's no doubt that both Gtk and Wayland do not support fractional scaling at all: https://gitlab.gnome.org/GNOME/gtk/-/issues/4345.

> System Preferences on a Mac (specifically the Display ("Displays"?) pane) also gives you some choices that make everything bigger or smaller, but if you don't choose the native resolution of the display, everything is very blurry -- which does not happen on a pure-GTK3 Linux install.

It absolutely does. You may not see it, but as of 2022, Wayland and Gtk are technically not able to do anything else here than scale. This is a known issue, and the reason why I linked that Wayland fractional scaling thing - it is not possible on regular desktop Linux (it is on Android/ChromeOS) to get adequately sized, rendered-for-size output on your 27" 4K screen or 13" 1080p screen.


I agree that 5K 27" monitors with 2x scaling would be perfect for macOS, but I still like 4K 27" with 2x scaling. It's only slightly larger if the monitor is placed at arm's length.


I just discovered the same dreadful thing with 27" 4K monitors, and found this site which explains what Macs are doing:

https://bjango.com/articles/macexternaldisplays/

TLDR: use monitors that are around 110 or 220 PPI. The 27" 4K is right in the middle at 163 PPI.
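
The PPI arithmetic behind those two sweet spots is easy to check yourself:

  import math

  def ppi(px_w, px_h, diag_in):
      """Pixels per inch of a flat panel, from its resolution and diagonal size."""
      return math.hypot(px_w, px_h) / diag_in

  print(ppi(2560, 1440, 27))  # ~109 -- comfortable at 1x scaling
  print(ppi(3840, 2160, 27))  # ~163 -- the awkward in-between zone
  print(ppi(5120, 2880, 27))  # ~218 -- comfortable at 2x "retina" scaling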


I have a ~110 PPI monitor (the currently ever so popular 2560×1440 at 27"), and the first thing I'd change about this would be to have it be around 90-95 PPI like all the monitors I've had since the 90s. There's not a single website I use regularly that I wouldn't have zoomed in just a little bit because all the text is just a tiny bit too small to comfortably read (I spend far too much time in front of a monitor to have it any closer to me). The smaller pixel pitch in turn seems to offer no added value at all. (At home I have my old 1920×1200/24" display as a secondary, and all the default-sized content just seem to agree better with that combo.)

I'd gladly take a monitor like this at 2× the resolution of course (so 5K), but again, if I could choose, not at 27" but at 30".


For me even that is too much :-P. I have a 27" 2560x1440 monitor and I consider it too large.

At least when I upscale older (or very demanding) games from 640x480 or 1280x720, they look crisp enough instead of the blurry mess they were until a few years ago, when this became possible outside of emulation.

But I bought it because of the VA panel, 165Hz refresh rate and flat surface, since I couldn't find any smaller monitor with those characteristics. So if that next-best-thing-to-OLED tech is actually good, I'll most likely get one, since I doubt I'll see a real OLED PC monitor in non-gargantuan sizes.


"Cooler Master also announced a 4K 160 Hz version of this monitor, the GP27-FUS. This device is also cheaper than other mini LED monitors. With similar specs to the GP27-FQS, save for a bump to HDMI 2.1, the monitor will cost $1,100 when it debuts alongside its lower-res sibling."


The price for a modest bump in specs feels absurd. There are plenty of really decent 1440p144 monitors, but you want FALD (full-array local dimming)? $300 just became $600+. Similarly, a ~$600 monitor nearly doubles in price when you add FALD. This tech commands an unreasonably vast premium for not much added build cost.


2.7k for gaming is not particularly hamstrung.


and, for its gaming target audience, much more hamstrung by being 60Hz.


MiniLED is not that great compared to OLED. The real innovation is getting quantum dot OLEDs to be much cheaper on average. They're brighter than traditional OLEDs and burn in more evenly (if they ever do), so you don't see burn-in of specific sections of the screen (like a taskbar) the way you do with traditional OLEDs.


This HDR obsession makes me want to abandon color entirely and switch to E-Ink.


The problem with e-ink is not so much the poor gamut as the low speed of switching the pixels.


Technology Connections had a video showing a large e-ink display playing video, and it was 3-5 FPS, so unless your use case is "videos and games", e-ink is already plenty fast.

It's even faster if you let it manage its own pixels and only update parts of the screen that need to be redrawn. You can't play videos like that, but scrolling a spreadsheet becomes much faster.


Wouldn't typing also just be horrendous on 3-5hz displays? I think the only thing that would be good on an e-ink display would be reading static text.


> scrolling a spreadsheet becomes much faster.

Have you actually tried? Do you have a video of this? Because as far as I know, and I actually work with electrophoretics, physics limits the speed of the ink. I hope you're not confusing something like A2 mode with actual display updates.


> The problem with e-ink is no so much the poor gamut

I don't understand what you mean by "poor gamut". Gamut of grayscale?



How do you have a problem with the gamut of a still-in-research technology that isn't even available as prototypes, let alone commercially available? Are you confusing that with Kaleido?


I'm still annoyed that I can't buy an OLED panel anywhere between 128x128 and full HD resolutions. The HD ones are too big and difficult to interface with and 128x128 is too small to make an interface on.


Is there a current display technology that doesn't implement variable brightness using pulse width modulation?


I'm not buying any TV/monitor until Micro LED is available (each subpixel is an LED). Everything I have (LED backlit) is good enough - and I'll just sit out the OLED and Mini LED (LED backlit zones) generation. It'll probably be a few more years but I think it'll be worth it!


>It'll probably be a few more years

Even Samsung QNED is at least 3-4 years out. MicroLED is definitely further out than QNED.


Honestly, if I were to do it over again, I'd go for 1080p, or maybe 1440p for a bigger monitor. When it comes to optimizing graphics usage, more frames over resolution tends to be where I gravitate. At the same time, I don't think I need 240Hz; 100 seems fine, and 60 was good too.



