Apart from programming, one of the motivations for getting the 8K display is to look at lidar point clouds. For example the desktop background in my post is a lidar map of Bernal Hill in San Francisco, which I've here downsampled to only 13006 x 7991 px for your convenience [1].
Admittedly, when I bought it at first, I didn't realize there would be so many random issues, as manufacturers all advertised their gear as "8K Ready" even in 2021. As I incrementally fixed the problems, I decided to document my journey in this blog post.
btw I posted this in the past but it got caught by the spam filter and disappeared [2], not sure how to appeal that when it happens. Thanks ingve for posting it again!
I am using a 43 inch 4k monitor so I am all in on big screen real estate. But I find that even with a quarter of your screen area, I struggle to read the corners of the screen, the bottom is often obstructed by whatever is lying on my desk, and I had to make the mouse cursor bigger as I kept losing it. I doubt that an even bigger screen would be practical. I do have two 43in monitors side by side but the other one is more like a secondary screen for playing movies or "storing windows"; it's too far from the eye to be useful as a primary monitor for reading and writing.
I had the Dell 8k monitor you mentioned, the picture quality was great but it died after a few years not long after the warranty expired (a gut punch at the purchase price) and they said too bad so sad... ok that's fine but I will never buy another Dell product again. It was released too early to have proper displayport support and I had to use a custom nvidia-driver X11 config to make it mostly work as two monitors. And there is basically no way to use that kind of DPI without scaling.
I replaced it with an LG 43UN700 which is a 43" 4K display that I use unscaled, and although the LCD panel is vastly inferior I love the thing, especially at the price point (under $700). I hope manufacturers continue to support this niche of large singular flat displays because they are fantastic for coding and data viewing/visualization, and pinch hit at content consumption as your article states, although this one would be no good for gaming. And getting a "monitor" or "professional display" firmware load means a lot fewer problems than a Smart TV load.
I had a similar experience with Dell after they wanted the price of a new laptop for a replacement laptop battery. This was for the Dell Studio back when battery packs were made to be swappable by simply sliding a latch.
After that phone call to customer support, I made a similar vow to never buy another Dell product. These days, I use a Framework laptop.
I did in fact buy a knock-off battery from ebay, but it kept its charge for hilariously little time. Had to run it off mains power permanently (ran it as a little server for a while).
FWIW, it was the same (even at the enterprise level).
We had a commodity (local cloud) computing Dell infra in the mid 2010s and were constantly replacing/returning “simple stuff” (fans, support flanges, memory, NICs).
“Dude, you're gettin' a Dell” became: nope, never again.
I use glasses (myopia) and can kind of tolerate the edges of my 32" 4k monitor, but I can't fathom craning my neck all the way up to the edges of a 55"+ display. Not to mention font sizes.
I have fairly bad eyesight with both myopia and astigmatism (-5 sph, -2 cyl) and I wear glasses. I got glasses with 1.71 index lenses, which I greatly prefer over the more common 1.74 index lenses due to the higher Abbe number, resulting in less chromatic aberration.
Anyway, I use browsers at 150% scaling usually, although the text is finer on my terminals. I don't use any scaling for UI elements and terminals. Using the i3 tiling window manager, I put more commonly used terminals on the bottom half of the screen since I find that the top half does require more neck craning.
FWIW there are lenses that are high index while still having a higher Abbe number, but they're expensive and use pretty specific materials. Interesting that 1.74 are more common where you are; where I am, lower index polycarb is the standard (sadly).
You don't maximize windows except to watch videos at that size. It's more like having multiple monitors with fluid borders. You focus as needed, leaving the rest in your peripheral vision. That said I did miss maximizing windows to focus on tasks.
I had a 55" TV as my main display in 2022. Had it about a foot away from my face. It takes a few days, but your brain and body get used to the size.
I just bought a 39" ultrawide and for the first few days I thought "oh dear, I have to keep turning to see the whole thing," but I've not even thought about it for a couple of weeks now, so I guess I'm acclimated.
I have been using a 32" monitor for the last 10 years. I have found that I am using mostly the center of the monitor. The peripheral edges remain unused.
If I sit far from the monitor, then the FOV could be reduced, but then I have to increase the font size defeating the very purpose of maximizing screen real estate.
This is pretty much what I concluded as well after using my 43" 4K LG monitor for about 3 years. Lately I've been trying out my wife's 27" Apple Studio Display. It's smaller but the PPI is amazing...
Nice to see other people doing the same thing I do, albeit with a 4k OLED instead. I am waiting for an 8k OLED at an affordable price but it seems I will have to continue waiting.
What brand and model of desk do you have? I have a 48" TV but I sit rather close so it probably takes up the same field of view as your 65".
As to your last paragraph, if you email hn@ycombinator.com and explain the situation, they'll sort you out and sometimes put you into a second chance pool, as it's called.
I wish deep desks were more common! Modern ultrawide curved monitors sit way too close for comfort for me because their legs have to be angled further back for center of gravity. Custom desks end up being so expensive.
i'm using a nice sheet of 4x8 finished plywood from the hardware store. i trimmed the depth down a bit, but not much. put some edge banding on it, and stick it on top of a flexispot or whatever other 4-legged desk frame you want to use.
How did you get it in a custom dimension? I'm almost tempted to just put two of my current desk back to back to make it deeper, would probably be much cheaper than 2k, but then again, they're not standing desks.
What zoom (if any) do you typically run at? For instance, a 200% zoom would give you an effective resolution of 4K, but with much sharper and smoother text and rendered graphics.
The power is not so bad, especially compared to the graphics cards you would want to use (and I use my GPU as a toe warmer). Samsung 8k specifically comes with low power presets which are probably usable in this scenario. Of course with so many more pixels in 8k than in 4k there is a need for more power, but the EU regulation allows selling them if they also support an eco mode.
I am old enough to recall 100W as the typical single light bulb and I still use an electric tea kettle that touches the multi kW range daily.
hey, i asked you on the other thread as well (the imac one) but this was my question
---
Hey, I have a similar setup (https://kayg.org/uses) where I use an LG C1 48" as my primary TV and monitor. I do all my work on it; however, I am unable to use tiling window managers as you recommend because I always struggle to see windows / text placed above my eye level.
For that reason, I prefer to use manual window management solutions instead.
I am curious how you deal with that problem, one big TV user to another? Or do you not have that problem at all?
thanks!
Yeah it does emit a bit of heat. I think around one or two hundred watts? I haven't measured it directly. I have a mini split air conditioner in my home office.
The comment above has very wrong numbers, by the way; typical consumption for the whole device should be around or less than what that poster claims is drawn just by the CPU!
I recently acquired a 43" 4K monitor for programming - a very boring Philips monitor, used at 100% scale. I hated it at first, but after a month I loved it.
2160p actual 'workspace' resolution at this distance (2 feet?) and size (43") seems close to a practical limit for typical use, I thought; even with this measly 43" it still requires a little bit of occasional head movement to see the top right corner. I noticed a tendency to sit slightly to the left of centre on this monitor, to avoid distortion and maintain clarity with what I'm focusing on (e.g. code/windows, not reference materials). Because of this I suspect at this distance a 43" with a slight curve would be optimal, at least for me.
What I wanted to ask you:
- What is your 'workspace' resolution? Is it something like 6K? I'm guessing your scaling is either 125% or 150%? Your PPI should be around 135, mine 102.
- Are you actually sat perfectly centre? I was wondering this because I keep noticing I tend to gradually shift my keyboard to the left over a day. Maybe this is years of 1440p + side portrait monitor use, I'm not sure, but eventually I accepted that I prefer slightly to the left (odd because my side portrait was on the left...)
- Do you think a curved monitor at this size/distance would improve the ergonomics? I imagine you must get a bit of a neck workout.
After getting this monitor, I'm pretty much sold on single screens again - but I had to switch my window management from keyboard-based tiling shortcuts to 'hold CTRL and move mouse' window management (BetterTouchTool on MacOS), with a tendency to stack up windows messily. I tried custom resize snap zones with BetterSnapTool - but I don't use them. I think that was the biggest challenge to switch from multi monitor to large format. It's a huge benefit to have everything in your context on one screen, but had to rethink how windows get moved around. Now I'm used to it, I want CTRL/SHIFT + mousemove modifiers on every system to deal with windows.
Also related, I bought a 4K tv last weekend for another system to use as a monitor, but found that the gaps between the pixels were unexpectedly large, creating a strange optical effect at close distance, making it unusable (but so close). There might be something different about the screen outer layer (on most TVs?) that polarizes light in a way better suited for distance viewing, but clearly not all TVs have this issue.
I don't know many Linux users doing 4k+ at 144hz. I am wondering if you do any screen capture or desktop recording, and if so what software you use and what your experience is like? I cannot reliably capture 4k/144hz with my setup but my desktop environment is still on X11. I tried KDE/Wayland and had a better experience, but run into other bugs based on their integration.
Just curious how your experience with sway has been. I installed it but wasn't expecting to come with no config at all and didn't really want to be bothered setting it up just to test screen recording.
The issue with X11 is that even if you record (using any software) it causes the display refresh rate to artificially drop, and it's a very bad experience overall when you run at 4k144hz. Ultimately, the future is wayland but I am a little surprised how slow it has been for everyone to integrate it into their software.
Yes. It makes the experience much better when anything is moving. Hard to convince with words; it's a case of try it, then go back to 60 to see what you're missing.
Similar to hard drive vs SSD. Before I used a machine with a SSD for the first time hard drives were fine, then my normal was conditioned to that of SSD speeds. Going back to hard drive speeds is painful, just like 60hz even for things like moving windows around the desktop.
They are useful for the same reason response rate is important -- motion blur and judder. Things look more crisp and move more fluidly across the screen.
What you really need to match is the angular resolution in microradians from your eye. You can make any screen smaller by sitting farther back. That said, I do wish my TV was only 42". I guess if you really want the ppi to be exactly the same as a 27" 5K screen, then 27 * 7680 / 5120 = 40.5".
This is exactly the reason I intend to stick with 4k for now: I don't want a display that large. I currently have a 48" 4k display, and I'd prefer to have a 42" or 36" one. (Good choices are hard to find, though, particularly if you actually want 4k rather than ultrawide, want OLED, and don't want to just use a TV.)
I bought the Philips Evnia which fits perfectly into that category at 42". Despite being a gaming monitor it's not garish and I've grown to love the ambilight.
Interesting. Time to buy a new TV or monitor for programming. Wonder which resolution and size to go for. I use a 4K 27" for programming and a super wide for my fs2020.
Btw I would use two different pairs of glasses: one when I use it as a TV or for playing fs2020/4, and another when I sit close to use it as a programming station.
I was previously working at a lidar company and now I am working at a robotics company providing calibration and localization software to customers using a combination of lidars, cameras, and other sensors.
> There is also a Dell UP3218K, but it costs the same as an 8K TV and is much smaller and has many problems. So I do not recommend it unless you really don’t have the desk space. Sitting further back from a bigger screen provides the same field of view as sitting close to a smaller display, and may have less eye strain.
You COMPLETELY missed the elephant in the room : 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!) Think 8 cores of the fastest arm 64-bit processors available plus extra hardware accelerators! They need this extra processing power to handle the 8K television load, such as upscaling and color transforms - which never happen when you are using them as a monitor!
So, 8K TVs are a big energy-suck! There's a reason why European regulations banned 100% of 8K TVs until the manufacturers undoubtedly paid for a loophole, and now 8K TVs in Europe are shipped in a super-power-saver mode where they consume just barely below the maximum standard amount of power (90w) ... but nobody leaves them in this mode because they look horrible and dim!
If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...
Anecdotally my house draws 0.4 kW when idle and 0.6-0.7 kW when both my 8K screen and my computer are on. Since my computer draws 0.1-0.2 kW, I surmise that the QN800A doesn't draw 300-400 W total --- maybe 100-200 W.
I run my screen on a brightness setting of 21 (out of 50) which is still quite legible during the day next to a window.
Also, I have solar panels for my house (which is why I'm able to see the total power usage of my house).
The parent comment is completely wrong on nearly every point it makes. I don't know why it's so upvoted right now.
It doesn't even pass the common sense test. Does anyone really think TVs have 200W CPUs inside just to move pixels around? That's into the territory of a high-end GPU or server CPU. You don't need that much power to move some pixels to a display.
I didn't smell anything. A 200W PSU isn't terribly expensive and being cheaper than more efficient processors seems reasonable. I also only run a single 4k monitor so haven't thought about driving 4x the pixels recently.
That's a facially absurd statement. Just on the numbers:
The US consumes 500 gigawatts on average, or 5000 watts per household.
So if every household bought an 8K TV, turned it on literally 100% of the time, and didn't reduce their use of their old TV, it would represent a 10% increase in power consumption.
The carbon emissions from residential power generation have approximately halved in the past 20 years. So even with the wildest assumptions, it doesn't "throw away all the progress we've made on Global Warming for the past 20 years ...".
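If anyone wants to redo the arithmetic, here is a rough back-of-envelope sketch. Assumptions: the ~5000 W per-household average from the comment above, a ~500 W draw as a ballpark of the claims upthread (near the measured 429 W peak), and the 139 W "typical" RTINGS measurement that another commenter cites further down.

    # Back-of-envelope sketch of the numbers above. All inputs are figures
    # quoted in this thread, not new measurements.
    avg_household_w = 5000   # implied by ~500 GW spread over US households
    claimed_tv_w    = 500    # ballpark of the claims upthread, near the 429 W measured peak
    measured_tv_w   = 139    # RTINGS "typical" for the QN800A

    for label, w in [("claimed", claimed_tv_w), ("measured", measured_tv_w)]:
        print(f"{label}: +{w / avg_household_w:.0%} household power if on 24/7")
    # -> claimed: +10%, measured: +3%; either way a far cry from undoing
    #    20 years of grid decarbonization.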
To put it in perspective, an electric car might need 350 Watt-hours per mile. A 10-mile drive would use 3.5 kWh. That's equivalent to about 24 hours of using that monitor at normal settings, or about 8 hours at maximum brightness.
The comparison doesn't make sense, though, because if you drove to the office you'd still be using a monitor somewhere. A single 4K monitor might take around 30-40W. Using four of them to equal this 8K display would come in right around the 139W typical power consumption of the 8K 65" monitor.
There's no "fixed budget" of energy that is ethically ok to use. The parent's point was that these devices are woefully inefficient no matter which way you look at them.
The "best" thing to do would be neither, and is usually to just use the device you have - particularly for low power electronics as the impact of buying a new one is more than the impact of actually running the thing unless you run it 24/7/365
> There's no "fixed budget" of energy that is ethically ok to use.
Not even 0.00001 W? How is it ethical to live in the first place in such case?
> The parent's point was that these devices are woefully inefficient no matter which way you look at them.
It's always a trade off, of productivity, enjoyment vs energy efficiency, isn't it?
If I find a setup that allows me to be more productive and enjoy my work more, certainly I would need to balance it with how much potential waste there is in terms of efficiency.
> The "best" thing to do would be neither, and is usually to just use the device you have
That's quite a generic statement. If my device is a budget android phone, do you expect me to keep coding on it, not buying better tools?
> You COMPLETELY missed the elephant in the room : 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!)
RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.
Your numbers aren't even close to accurate. 8K TVs do not have 200W CPUs inside. The entire Samsung QN800A uses less power during normal operation than you're claiming the CPU does. You do not need as much power as a mid-range GPU to move pixels from HDMI to a display.
> There's a reason why European regulations banned 100% of 8K TVs
This is also incorrect. European regulations required the default settings, out of the box, to hit a certain energy target.
So large TVs in Europe (8K or otherwise) need to come with their brightness turned down by default. You open the box, set it up, and then turn the brightness to the setting you want.
> until the manufacturers undoubtedly paid for a loophole
This is unfounded conspiracy theory that is also incorrect. Nobody paid for a loophole. The original law was written for out-of-the-box settings. Manufacturers complied with the law. No bribes or conspiracies.
> If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...
The Samsung QN800A 8K TV the author uses, even on high settings, uses incrementally more power than other big screen TVs. The difference is about equal to an old incandescent lightbulb or two. Even if everyone on Earth swapped their TV for a 65" 8K TV tomorrow (lol) it would not set back 20 years of global warming.
This comment is so full of incorrect information and exaggerations that I can't believe it's one of the more upvoted comments here.
> RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.
Can you explain why a TV's power fluctuates so much? What does peak load look like for a TV? Does watching NFL draw more power than playing Factorio?
> The average American household uses about 29 kilowatts of power per day (29,000 megawatts).
Ignoring the megawatts error that the sibling pointed out, it's 29 kilowatt hours per day. Watts are a unit of power consumption -- joules (energy) per second.
One kilowatt hour is the energy used by running something at 1,000 Watts for one hour.
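A tiny worked example (the 139 W is the RTINGS "typical" figure quoted elsewhere in the thread; the daily hours and the electricity rate are made up purely for illustration):

    # Watts are a rate; kilowatt hours are an amount. Illustrative numbers only:
    power_w, hours_per_day, usd_per_kwh = 139, 8, 0.15
    kwh_per_day = power_w * hours_per_day / 1000   # 139 W for 8 h = 1112 Wh = ~1.1 kWh
    print(kwh_per_day, kwh_per_day * usd_per_kwh)  # ~1.1 kWh/day, ~$0.17/day at $0.15/kWh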
To be fair it's not the energy that you're concerned with; it's the source of that energy.
Private jets can't run off nuclear power grids. Also the real problem-child of emissions is not America. China has a billion more people, what are their TVs like?
Good points. I would go further and say it is the integral of emissions over time that we would be most concerned with. From that perspective, over the last 200 years, there are established problem children and rising problem children.
I also recommend this. Been using a 43-inch 4K TV for the last 10 years. My first TV (Vu 43inch Iconium) died this year; got another 43-inch 4K TV from LG (43UT8050) now. Ensure you get one that supports at least a 60hz refresh rate at 4K; my first one did not. It even starts faster than android TVs. I always keep it on game mode; this setting ensures minimal input latency and no TV-side post processing. The smart TVs don't need to be connected to the internet, since I don't use their smartness. Finding dumb TVs is difficult here.
> The bezels and gaps in between the monitors introduce distractions and one is limited in how one may arrange terminals and windows across multiple displays.
To me, the segmentation is a feature. It lets me offload information density and focus. For example, I commonly have an editor on one screen, a browser on the second, and something like a chat app, terminal, etc on the laptop screen.
Nobody's stopping you from segmenting one big monitor into different regions; and you get to choose how big those regions are from day to day rather than being forced into it.
They tend to be relatively poorly handled by the software, at least out of the box.
Every major modern OS now has some level of tiling/splitting on a monitor's edges baked into its window manager by default. Some can be tweaked to split into smaller subgroups, but that often requires less well tested/polished options (some apps just ignore the hints), or even third party extensions.
That's too much extra work. With multiple monitors you can maximize primary apps while still having manual management of smaller supporting apps on another monitor. You also get more edges for rapid snap to the sides of a monitor.
Since you seem to know about the best window managers, can you recommend one for MacOS which will let me direct focus to whichever window is left/right/down/up of the currently selected one? i3/sway does this just fine, but my impression is that MacOS's api doesn't allow third party developers to pull it off, but I'd love to be wrong about that.
Not the person you were asking, but after years of using i3, AeroSpace is the only way I can use a Mac productively, and does indeed have the feature you're describing.
Wow. Thanks for this. I've bounced off a number of tiling WMs on MacOS over the years - Amethyst, Yabai, others I can't remember - but Aerospace is really excellent. Can't believe I've never heard of it before. Love the custom implementation of spaces as a solution to what ails a number of other tiling WMs. I installed it this morning and disabled the mish-mash of Rectangle Pro, Better Touch Tool and OS kb shortcuts I'd been using.
it has quirks and limitations, some of which can be fixed by disabling system integrity protection but it can definitely handle window tiling and navigating with keybindings when you use the companion daemon https://github.com/koekeishiya/skhd
I use yabai which does what you say and more pretty well. It also lets you completely remove spaces transition effect but this will require disabling of SIP.
Although if that big monitor is an OLED, segmenting it into halves or quarters is kind of begging to end up with a line burned in down or across the middle eventually.
Samsung solves this in the TV itself. It can be annoying when the edges of the screen are ever so slightly off, but i'm glad I don't have to worry about it. QCQ90S. I wouldn't recommend it since the tv's gui is glacially slow, but then again all the ones I tried last year were.
I have done this in the past using a tiling window manager and it's still better to use different displays. There is something about our monkey brains that makes 'different physical object = do different things' work better than having it all on the same monitor.
I did get it to work for me with thick black bars between the screens, but when you're giving up an inch of screen real estate for every virtual monitor then you might as well get physical ones.
I use a single ultrawide at home and dual-monitors at work.
Initially I thought the one-monitor experience was more seamless, but I do miss the implicit window organization that dual monitors provide. And screensharing on the ultra-wide is a pain.
My Samsung ultra wide has side by side mode with two input cables. Screen sharing (and Windows) thinks it’s two monitors but I can stretch windows all the way across both if I want to since it is an extended set
Best of both worlds, I wish there were a way to configure this within the OS so that you could make a single screen appear like 2, 3, or 4 logical screens.
A decent window management tool (e.g. Rectangle.app) should resolve most of your window management issues - set up many drag points to easily divide windows by half, thirds, quarters, sixths, etc.
Most screen share apps should support sharing by window. Also best for privacy (so your viewers don't see the side channel chat notifications pop up).
Also an ultrawide monitor is preferable for spreadsheet warriors.
I will not give up my 49" 21x9 for anything lesser.
If your ultrawide is anything like mine, it also has a setting that lets it register as two separate monitors (PIP/PBP mode), which is like having two monitors without the bezel, but with the convenience of "there's an edge" in the middle of your screen when doing regular desktop work.
Does require two cables of course, but if you're driving an ultrawide, you're probably using a graphics card with three or four outputs anyway.
Same, I've never liked spanning a window across multiple monitors. The discontinuity of the bezel is a handy mental break. Often I'll have email and teams on one screen and my main item of work on the central screen.
Same. This utility is also multiplied by having a separate set of virtual desktops on each display, which lets one create sets of windows/apps that can be mix-matched between screens, reducing the amount of window-shuffling to almost nothing after initial setup.
This is only possible under macOS and Linux, unfortunately. On Windows virtual desktops are still kind of a weird hack that spans one desktop across all monitors.
Yeah that issue seems weird to me, because I've never found bezels themselves to be that much of a problem. Like sure, less bezel is better. But I have some pretty wide gaps in my work monitors, and I've never found it to be a problem.
This article, and a lot of "productivity" articles, feel like spending a lot of time and effort for marginal-at-best improvements. I don't know their specific workflow, but I'm pretty sure they could get basically the same amount of productivity with a handful of 1080p monitors.
After 15 years of having a desk job I find that I’m more sensitive to the position I sit in. My back feels a lot better if I have a single, regular sized screen right in front of me, instead of having additional screen estate on the sides or below (as with a laptop).
At the same time I use virtual desktops that I can switch with both keyboard and mouse.
The general advice is to have top of monitor at eye level, but it's been wrong advice for me personally. I now put the middle of the monitor at eye level. Keeps my head up and posture better. Leaning back instead of stooping.
The general advice provided to me, and relayed by me, is eyes centered at about 2/3 of the way up the screen.
The best advice received and relayed by me regarding posture might surprise you.
If you struggle with posture, stop caring about what other people might think about your posture. Changing/Tweaking posture all the time might look bad, but it also tends to mitigate the effects of being frozen in bad posture(!) The health impact is too significant to ignore.
Yeah I think the only ergonomic advice I believe anymore is that there does not exist a position that is ergonomic to sustain for more than a couple hours. Humans are not evolved to stay stationary, few mammals are really.
I do this too, though mostly out of necessity. I use a 27" screen a couple feet away. To get the top of the monitor level with my eyes I'd either have to lower it so the bottom of the monitor was almost flush with the desk (which my current monitor's stand won't do anyway), or get a taller chair/lower my desk, both of which would leave my legs rubbing up against the desk underside and my arms at an uncomfortable angle for typing.
Either I have an abnormally short torso, or that advice was written back when most people were using a 14" display.
Indeed. AIUI your head needs to be back, chin tucked in, which means looking down a bit. If you're looking level or up you're going to be sticking your head out a bit
I'm the same. I use a single 27" 4k monitor and use virtual desktops. The best upgrade for me though was getting a computer prescription for some glasses that I keep on my desk.
Sometimes I think about upgrading to a 5k monitor. The Apple Studio Display looks great, but I'm a Windows user and I'm guessing a lot of the nice features of that display are Mac-only.
There aren't a whole lot of options for 5k monitors. Other than Apple I think there's a Dell, but it's too wide. There's a Samsung but I've been burned by Samsung too many times. There's also an LG 5k monitor but it gets pretty weak reviews.
> The Apple Studio Display looks great, but I'm a Windows user and I'm guessing a lot of the nice features of that display are Mac-only
I can possibly be of some help here. I have a Studio Display, however my work-provided machine is a Dell laptop and so that is what is connected to it most of the time.
Providing your machine can output video via Thunderbolt or USB-C, it will work. That is fairly common these days, though Windows machines capable of driving a 5120x2880 signal can be harder to come across, particularly in the corporate laptop world, though I don't know how much of a concern that is to you.
My last work machine maxed out at 4K which the Studio Display would happily scale up to full screen. I would describe it as substantially sharper than e.g. a 2560x1440 display of equivalent size, but still noticeably less sharp than the full native 5K (obviously). My current machine can do the full 5K, but the performance leaves a lot to be desired (however the thing is a turd anyway, too much corporate security crap bogging it down).
Speakers, camera, and microphone built into the display all work totally fine from Windows. What may be a total non-starter is that you need a Mac or iPad to change the brightness, because there's no physical controls on the display itself and Windows doesn't expose a way to control it. I am lucky/unlucky in that my home office does not get a huge amount of natural light, meaning I've been able to set it to a comfortable brightness from my Mac and then just leave it.
Overall it's a very nice monitor if you can work around the brightness thing. A possibly better contender though is the recent-ish 5K variant of the Asus ProArt[0]. I was using the 1440p version of the same monitor before I got the Studio Display, and I was very happy with it. Good colour reproduction, USB-C Power Delivery for one-cable laptop docking, and a far more adjustable stand than the SD. Worth a look.
I've got the LG 5K and it's been totally dependably kick ass for the 4 years (i think) since I got it (from the Apple Store). Mostly using it on macOS but have used it with Windows and haven't tried with Linux.
Agreed. To each their own, but the obsession with the biggest and/or most possible screens is something that is very hard for me to relate to. As soon as I am regularly craning my neck to see all of my screen real estate, it is no longer a positive in my life. I'm glad these solutions exist for people who enjoy them, but they are definitely not for me.
Same here. I only use and want a single monitor setup. I can alt-tab between windows faster and more comfortably than turning my head to another screen.
Also a dual/multiple setup bothers me for losing the mouse boundaries when it crosses to another screen - I'd rather have the mouse bounded on one screen for faster access to menu bars at the edges.
Have been sporting a 4K LG CX48 OLED since ~Sept 2020; best monitor decision ever. I've got two HDMI out cables, one going to my gaming rig and the other to my Macbook where I do my work as a developer.
I haven't noticed any burn-in or dead pixels. You need to set it up for success: enable all the burn-in prevention settings the monitor provides (static image darkening, pixel shifting & cleaning). It's also a great idea to do other things such as sleeping the monitor after 1 min of inactivity, no screensaver (or just black), black desktop background, hiding taskbars, etc.
edit: to add, i have the monitor mounted to the wall and about 1" above the height of my desk[1] - this puts the center of the screen directly at eye level
I stole one of these from Best Buy for $500 in march. It’s just so good. I haven’t turned off the local dimming thing with the service remote so that’s still a thing but damn is it such a great monitor. And for gaming cyberpunk at 120hz with hdr melts your face.
I upgraded recently, by buying a friends old Samsung Odyssey G9 49" curved monitor off him (he was emigrating). Before that I had 2 x 27" monitors, a setup I had used for ~10 years.
I honestly think the curve is essential when dealing with such a wide display. The alternative would be - as article states - to set it back a little and have a deeper desk so you can actually see the edge of the screen properly. I don't see the point in having a large screen with high pixel density if the edges are not actually easily visible to me without moving my head or body laterally.
The lack of bezels is great though - I'd definitely agree on that front, having 3 web browsers or editors open side by side suits me really well.
It’s different from person to person!, whether the curve is good or not.
I have a ruler flat 55” OLED TV as main monitor. It’s perfect for me. I’m like… 1-1.5 meters from it where I’m closest to it, haha. The edges are further away. It’s fine! – imo / ime.
(The need for the curve is also subtly different depending on how the panel was made. I tried a flat 43” IPS 4K monitor, expecting IPS to be good. And it wasn’t very good. The IPS features in that panel were large enough to affect viewing angle.)
> It’s different from person to person!, whether the curve is good or not.
The amount of curve also varies a lot between models so there's some nuance even within that. The curve might be as strong as 800R or as weak as 2300R depending on the monitor, where the number corresponds to the radius of the circle the panel follows in millimeters.
Same, though I'm also on 49" (5120x1440). They're selling them for extra cheap on Amazon with extended (36mo) warranties because they're prone to breaking, but I had the Samsung contractors out here this month and they did a great job fixing mine that randomly died one day -- for free! If you're a chill soul, I'd say it's worth the risk.
I sound like a shill, so Samsung plz hmu. $999 for a beautiful OLED monitor that fits a terminal, a browser, and 4 (font size 8...) 100col text editor windows is a gamechanger.
As weird as the aspect ratio can be on a curved ultrawide, I think it's also more natural and ergonomic to keep your head/eyes at a constant height and just move them side to side. With a monitor that has a lot of verticality you're gonna have to tilt your neck back more.
Low response time (i.e. time it takes for a pixel to change color) to reduce ghosting, and a high refresh rate up to 240 Hz.
These monitors are expensive and do not have very high resolution. If you're not a hardcore fast reflex gamer, and you spend a lot of time looking at text, then IMO it's better to buy a higher resolution monitor for less money.
I think at that point it’s not really conscious any more? It always takes me a little while to realize my monitor somehow went to 30hz, and that’s why I’m feeling something is off.
4K gaming monitors do provide a reasonable middle-ground between "extremely fast but only 100-110ppi" and "extremely high res but only 60hz" now though. You can get 163ppi at 144hz without breaking the bank, which isn't quite retina by Apple's definition, but it's good enough for me considering the benefit of high refresh rate.
I'm guessing because it allows you to set the Field-of-Vision to be pretty wide?
I mostly play simulation games, particularly flying, and having a wider FoV makes things easier, until you're ready to go to the top step of using VR instead so you also get depth perception and essentially 360 FoV since you can rotate your head.
I wonder what the math would look like to properly render 3D scenes onto a curved display. Could it be accelerated as well as the regular matrix operations used for perspective projection onto planar screens?
During the pandemic I did try out my 4K TV as a game monitor. I had a combination of furniture so that I could sit rather close with my eyes approximately half way up the screen, with a keyboard and mouse in a reasonable position. Then, using an older FPS game I got it to where my laptop GPU could hit good frame rates and I adjusted the game's viewing angle to match how the screen fit my field of view.
It was deeply immersive in spite of me being so close I could "see the pixels". The only time I've felt more immersed was demoing Quake in a 3 wall + floor CAVE at a national lab decades ago.
> I wonder what the math would look like to properly render 3D scenes onto a curved display. Could it be accelerated as well as the regular matrix operations used for perspective projection onto planar screens?
The math is pretty simple to account for a curved viewport, even though I don't think any apps actually care about that. Most displays aren't curved enough to make it a meaningful difference.
We don't have fixed function pipelines anymore either so that could definitely be handled by hardware.
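A minimal sketch of what that math looks like, assuming the eye sits at the center of the screen's curvature (real desks won't match that exactly): render the usual planar projection, then remap each column by its viewing angle. An engine that cared would do the equivalent per pixel in a post-process shader; the mapping itself is just this:

    # Remap a planar-projected image onto a cylindrically curved screen,
    # assuming the eye is at the center of curvature. x_cyl in [-1, 1] is
    # a column of the curved screen; the function returns which column of
    # the flat render belongs there.
    from math import tan, radians

    def planar_x_for_curved_x(x_cyl, h_fov_deg):
        half = radians(h_fov_deg) / 2
        angle = x_cyl * half            # curved screen: angle grows linearly with x
        return tan(angle) / tan(half)   # flat render: x grows linearly with tan(angle)

    for x in (-1.0, -0.5, 0.0, 0.5, 1.0):
        print(x, round(planar_x_for_curved_x(x, 100), 3))   # 100-degree horizontal FOV
    # Edges map to +/-1 but mid columns pull toward the center, which is
    # exactly the stretching you notice near the sides of a wide flat projection.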
This used to be much more true, but almost all PC games support 21:9 now and 32:9 support is pretty common too. "Most games" being screwed up is an exaggeration IMO. Even on games that don't officially scale, on PC they almost always have a customizable FoV that gets the perspective correct again. Many modern games are even smart enough to rearrange the UI so that the critical info (health bars, ammo counts etc) is in the center of the display and not attached to the edges.
PC games have kinda been forced to support ultrawides whether they like it or not - the 21:9 class especially has exploded in popularity for gaming PCs.
I've gamed in 32:9 for years now - I wouldn't go back. The curve is not exaggerated enough to be a meaningful projection issue on most curved displays and games.
It's the curve that messes things up. It's just significantly more incorrect on wider displays. Many monitors are 1800R, and that's easily curved enough for the projection error to be quite pronounced at 32:9 using a planar projection.
32" Odyssey G7 is the pick for me, I wouldn't mind an upgrade to the 4k version, but the 1440p version is more than good enough.
I also don't see the point in having a screen so big I have to move my head, or contrarily a screen so big that I have to push it back so the pixel density matters much less.
According to https://tools.rodrigopolo.com/display_calc/, a 65" 8K like the one in the article is retina at a 26" viewing distance (136 PPI). For reference, a 27" 4K screen has 163 PPI, and is retina at 21" by the same math. A 27" 5K (like the Apple Studio Display) has 218 PPI and is retina at 16".
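If you'd rather not trust the calculator, the usual rule of thumb behind "retina" is that one pixel should subtend no more than about one arcminute. A quick sketch (the one-arcminute threshold is just the common convention; it lands within an inch or so of the figures above):

    # "Retina" viewing distance: distance at which one pixel subtends one
    # arcminute (the common rule of thumb).
    from math import hypot, tan, radians

    def ppi(w_px, h_px, diag_in):
        return hypot(w_px, h_px) / diag_in

    def retina_distance_in(ppi_val, arcmin=1.0):
        return (1.0 / ppi_val) / tan(radians(arcmin / 60))

    for name, w, h, diag in [("65in 8K", 7680, 4320, 65),
                             ("27in 4K", 3840, 2160, 27),
                             ("27in 5K", 5120, 2880, 27)]:
        p = ppi(w, h, diag)
        print(f"{name}: {p:.0f} PPI, retina at ~{retina_distance_in(p):.0f} in")
    # -> ~136 PPI / ~25 in, ~163 PPI / ~21 in, ~218 PPI / ~16 in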
The DPI of this screen is too low for all the drawbacks. Would rather have crisper text (150+ DPI, 200 preferable) and/or be able to carry it myself. Needs to be about 42" for that.
The proper monitor height is when the top third of the screen is at or slightly below your eye level when seated or standing upright. This positioning helps prevent neck strain and allows for a comfortable viewing angle.
The top third of a large TV will be much higher than that, which will cause long term discomfort.
That's why large monitors have much wider aspect than TVs.
Yep a huge monitor sounds good in theory but you end up with neck and eye strain from panning your head constantly unless you place it so far away that it’s effectively a regular monitor at a regular distance.
Would recommend a black background Vscode theme for an OLED. The black background with red accents looks beautiful, at least on my smaller XPS 15 4k OLED. I use Dobri Next Black with some customizations but it looks good by default as well.
I got my 8k 55" tv for under 1000 usd several years ago. Brand new, from a brick and mortar electronics store. So it is definitely possible to make 8k monitors for less than 1000 usd.
A mere 55" with 8K resolution makes no sense as a TV, but it's glorious as a productivity monitor. But instead of becoming commonplace as monitors, the panels seem to just be disappearing even as TV's. At the moment I can't find anything at any price that can replace my current setup.
The market isn't working for monitors. Everything available now is either crap, or costs 10x more than it clearly could. Millions of people are spending years of their lives in front of bad screens because monitor makers don't want to make good ones.
I feel like Apple's 32 inch 6k display would be the sweet spot for me, but it's 60 hz and costs what... $6,000? I just use 27" 4k monitors for work. It's fine but I'd definitely like something a bit bigger and even crisper. I have to use windows for work though.
“It can display seven equally spaced vertical columns of text (critical importance), has driver issues (minimal importance), wake issues (who cares), it costs as much as four smaller monitors (this is good), I need a huge desk (hell yeah), there are multiple image quality issues (well it’s not like I have to look at it all day)…”
It is like “I spent fifteen hundred dollars on a multitude of hassles due to purchasing the wrong type of display, but due to the lack of bezel this is a prime efficiency move “
I chuckled at "The 8K display is only $1500 at BestBuy!" the "only" lol I spent $400 on my projector that I use for my main screen and it works great. But when I did that I had previously only bought $200 projectors. So even that was not an "only" for me.
I've never spent more than $75 on a monitor. I only buy used. Monitors depreciate like crazy and businesses are constantly getting rid of them, even when they're only a few years old. Yeah, you aren't going to get some 9001Hz 10K giga-OLED whatever, but I'm a programmer. If it displays text with reasonable contrast without hogging my whole desk, it does everything I need it to do.
The most expensive one - the $75 one - is a 24" 1920x1200 IPS display with HDMI, DP, VGA, 2x DVI, S-Video, and YPbPr component. Never seen those last two on a monitor before, but there they are. I don't use that display as my main one anymore, but I keep it around because it's awesome and it plugs into literally anything.
I remember dropping a grand on a 30-inch 2560x1600 back in the day and thinking that was the ultimate.
I think 40 to 45 inch is the ideal; otherwise screen real estate goes too far into the peripheral vision.
The other issue with a lot of really big screen real estate is managing lots of windows. With dual screens you can usually keep maximized pairs of applications more easily than with one, cuz when you maximize on the super big screen it just takes up everything.
And it pushes the most relevant stuff, which is usually in the upper left hand corner, to the far upper left corner, which actually is pretty far out of your main field of vision.
But I still love the 43-in 4K TV I've been using since 2010 or so
Only the first one (dirty screen) is a real issue, but it is subtle and irrelevant to programming; the second one (checkerboard), as the post explains, is solved by toggling an option in settings.
> Driver issues
The post explains that it works perfectly with current NVidia drivers on Linux, and on Windows both AMD and NVidia have had driver support for HDMI 2.1 for years.
I'd give a lot to go back to my 20 year old eyes that could see pixels without special glasses. Sure, now I can't see pixels (well, maybe I still could on a janky third party CGA monitor from 1983), but it isn't worth it. (I'd say save your eyesight, but realistically I'm not aware of anything you can do to keep it past about 45.)
I've used both. I quite honestly don't care. I've heard many people that share your sentiment. But some of us just don't. Visible pixels are totally fine for me.
I went back from using different displays in HiDPI to using a single 43” 4K screen set to 100 % scaling. Screen estate trumps invisible pixels [for me, at the moment].
I think you'd have to sit further back than is otherwise natural (and then have the issue of legibility/lost workspace) to achieve "can't see the pixels" on this.
Sure it's 8K but it's 65", so it's only got a PPI of 135. For comparison, Apple (computer) displays and a handful of third parties that target Mac use are generally 200-220 PPI. That is "can't see the pixels" density, even if you smash your face against it.
220 ppi output with no subpixel rendering (ie modern Macs) has clearly visible jagged edges in angled lines and letters if you've got good vision or correct your vision to better than 20/20 (my case: I get headaches if I don't).
If you are coming from typesetting world, laser printers from the early 1990s did 600dpi (dots per inch), and that remains sufficient for smooth lines, though newer printers will do 1200dpi too. Going down to 300dpi printouts is crap.
Heck, newer Kindles do 300ppi and that can clearly be improved.
Apple's "retina", like all things in life, does work for 90% of the human population as advertised, but there's still a big number of people who have better angular resolution than what they target.
I have a 55" 8K and I can't see the pixels while sitting 2ft away. Everything is crisp and I have a huge workspace. For mac I use 4k native so 2x integer scaling.
I didn't see any mention of how many times he has to pick up his mouse when it gets to the edge of the pad to get the mouse from one edge of the screen to the other.
Author here: I use a Logitech G Pro X Superlight but also I use the i3 window manager and rely on keyboard shortcuts for a lot of the navigation. I have the mouse sensitivity set so that the cursor can traverse the width of the screen when moving the mouse about 13 cm, without any acceleration. This is still precise enough that I can move the mouse pixel by pixel if needed.
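For the curious, a rough sketch of what that setting works out to (the only inputs are the 13 cm sweep and the 7680 px panel width; the rest is arithmetic):

    # Arithmetic behind "13 cm of mouse travel for the full screen width".
    screen_w_px = 7680
    sweep_cm    = 13
    cpi = screen_w_px / (sweep_cm / 2.54)         # counts (pixels) per inch of hand travel
    um_per_px = sweep_cm * 10_000 / screen_w_px   # micrometers of hand motion per pixel
    print(round(cpi), round(um_per_px, 1))        # ~1500 CPI, ~17 um per pixel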
I find it annoying that they've kind of got rid of mouse trails and other easy ways of finding the mouse pointer.
That's one of the main drawbacks of a massive screen: if you lose the pointer it takes a lot longer to find it. It's not a linear scale based on the width of the monitor; it scales with the square.
So a 50-in monitor is going to take about four times as long to find a mouse pointer on as a 30-in one.
I don't like those hotkeys where, you know, it highlights it. I like the mouse trail; that's the one I can most easily find it with. But generally those went out of fashion about 15 years ago.
Pointer trails are still a feature in windows last I checked, and hitting ctrl to animate a circle around it works pretty much everywhere. I don't use either of these features nowadays, and usually find my cursor by moving it until I get to somewhere with high contrast.
I haven't seen anything I like quite as much for quickly finding the cursor as macos's "wiggle for giant cursor" feature.
The example he's chosen is of a ridiculously sized TV. 65" is living room TV size.
There are smaller OLED displays that would be more suitable (while still rather big). Many are 'just' 4k, but the smaller sizes should give one a decent pixel density.
Have you had issues with image retention? I also like the 43” 4K setup for some things, but these days it seems IPS screens in that size are not as easy to find, I’ve always been wary of OLED due to burn-in
With the LG I'm about a meter or less away from screen and use window management tools to pull focus to the center lower section for any focused work. I run Win 11 from an RTX3080 card with a 2.1 HDMI cable. 3840x2160 120Hz.
For gaming I just use windowed mode and use the full width of the 65" but just the lower half usually for COD or FPS games. I don't notice any eye strain or other issues but do run everything I can in dark mode including using the browser with the Dark Reader extension.
I’m doing something like this in my current home setup, but the thing I miss most about multi-monitor is screen sharing on Zoom.
I used to be able to just share one entire monitor and could drag windows I wanted to make visible to that display. Now I tend to share single applications, and have to unshare and reshare to change the view.
First world problems and all, but it would be nice if Zoom let you partition off a part of a display (instead of all or nothing). Would love to draw a bounding box of "share everything in this box."
I don’t think this annoyance is enough to make me go back, but there are times when I’ve considered it.
Deskpad might be what you’re after! It’s a virtual display in a window, you can share that instead of your whole screen but still get multi-app flows captured
From experience with a 55" 4K OLED as main monitor, I can attest that the length of the caveat list is not indicative of the total impact of the caveats. It's more an indication of a thoughtful and thorough person writing the list.
I went with the LG CX model based on what I read on rtings.com
That’s a previous-generation model. I think all of the LG TVs are good.
There are / were technical caveats. I believe all of them are solved by M3 macs that have HDMI 2.1 ports. (M3 or M3 Pro or something? The ones advertised as 8K capable.) Out of the box, those will do 4K 120Hz HDR with variable refresh rate and full 444 color. This is what you want.
It is possible to get that going on older machines, except for VRR which is more of a nice-to-have anyway.
I have a 2018 Macbook Pro 15”. Disclaimer!: My setup was a “complexity pet”, a tinkering project; There are simpler ways to connect a 120Hz 4K HDR HDMI 2.1 display to a non-HDMI-2-1 mac. And! My tinkering project wasn’t only about getting the display working correctly. It was more about messing with eGPUs and virtualization and stuff. Definitely a long way round.
On my Intel mac, I use an AMD Radeon 6800 XT eGPU with Club3D or CableMatters DisplayPort-to-HDMI 2.1 adapters. Plus some EDID hacking which is easy to do.
EDID is how the display identifies itself to the OS. The EDID payload can be overridden on the OS side. Mostly it’s about copying the display’s EDID and deleting the entry that says the display can accept 4:2:0 color. Only then does macOS switch to 4:4:4 color. I also created a custom “modeline” with tighter timing to get 120Hz going fully.
—Please be assured that this was way more complex than it needed to be. It was for fun!
There are much easier ways to do this. Lots of forum posts on it. On the MacRumors forums iirc? User joevt is The Man.
And even then, what I wrote above is actually easy to do once you know it’s possible.
Mostly though you really want an M3 Mac that just has HDMI 2.1 and is ready to go.
There are/were also OLED gaming monitors available, such as from Alienware. Those have DisplayPort inputs and are ready to go with almost any older Mac. Might be able to find one for a price equivalent to a TV, idk.
I believe the discussion about text rendering is referring only to a line of very cheap TVs that do not in fact have RGB pixels. They have half RG and half GB. For "normal" video content, this is a surprisingly low quality drop. For high-contrast text it's total murder. You can see the stippling pattern as clear as day and it can easily render 8-10pt text literally illegible.
IT once accidentally bought such a TV and had it in a conference room. Took us a while to convince the relevant people that, yes, it is nominally working fine, it's not "broken" in the sense that it doesn't turn on or half the screen won't light up, but it was intolerable for Zoom screen shares.
But you need to be scraping the bottom of the barrel to end up with those screens. I doubt you could find something labelled a "monitor" that has that, and, well, if you're putting a $150 40" TV on to your computer... I mean... what did you expect?
(There are also low-end TVs that are still using some crappy LCD techs with bad viewing angles that may make them difficult to use up close, but I wouldn't call that a text rendering problem... those issues just wreck everything. I once had a laptop that when used on a lap, had zero viewing angles; if the vertical middle of the screen was correct, the top and bottom was extremely visibly color shifted. Even the cheapest store brand TVs don't seem to be that bad anymore, though.)
> I believe the discussion about text rendering is referring only to a line of very cheap TVs that do not in fact have RGB pixels.
It also comes up with very expensive OLED monitors, which do usually have true RGB or WRGB pixels, but their subpixels are usually not arranged in the standard horizontal RGB stripe which breaks most implementations of subpixel font rendering. With a sufficiently high pixel density it doesn't matter, but with the ~108ppi of a 27" 1440p OLED monitor the text rendering can be quite visibly worse than a 27" 1440p LCD.
> TVs may have a different subpixel layout than monitors, so small text may suffer fringing. As of writing the Samsung VA and LG IPS panels such as the QN800A have a conventional RGB or BGR subpixel structure. One may also increase the font size or use hidpi scaling which will eliminate all pixel-level concerns.
Back in my daze at Boeing, I had a full size drafting table in addition to the usual desk. I've always wanted a display that big. In fact, I want my entire desk surface to be such a display!
That's simply too big a screen to be sitting right in front of.
I do agree on the basic idea of not running two monitors tho. I used to, and I got neck pains eventually.
My current setup is a single 32" curved QHD monitor and I wouldn't change it for the world. It's just the right size so you can see the whole screen at once, yet large enough to run 3 browsers side by side.
Also, I want to suggest people to learn about virtual desktops rather than wasting money on bizarrely huge screens or multi monitor setups.
If you have it set up right you can flip to the other desktop quickly, see what you want, and flip back fast. I haven't seen a good virtual desktop implementation since around 1998 though, and have given up.
55" is not too big. Maybe it's too big for you, but I've been using three 32" 4k screens in portrait for many years, combined they are essentially about the size of a 55" screen. I love it and anything less kind of sucks. No, virtual desktops are no substitute for having more screen size. I use virtual desktops on my massive screen(s) and I love that too.
The 3 32” screens are probably angled around you and the total aspect ratio is extreme widescreen (side to side panning, not vertical neck up down panning). The 3 screens are likely much much better ergonomically.
I used a tv as a monitor for a while and it was great -- but there is one problem with single monitor setups -- screen sharing/recording. If the app you're using lets you select a portion of the screen to share, that's great. But something like Slack you either share an app window, or the entire screen. This is very annoying in a single monitor setup. It would be amazing if you could select a part of your screen and tell the OS "treat this area like a separate monitor".
I have used one of the original 4k TVs-as-a-monitor ( https://www.avsforum.com/threads/review-of-the-seiki-39-4k-d... ) as my central monitor (plus one on each side) for 10+ years now. Not feeling any need to upgrade (don't do graphics/games, just lots and lots of text terminals and browser windows)
On most monitors I've been using these days, I keep scaling the resolution down. I've noticed that the bigger the text, the more comfortable my eyes feel. I still prefer a good high-res monitor because it scales down with less blur
I am excited for 8k monitors in the future, because they give you a lot more options for integer scaling than current 4k displays.
I know this is a nerdish hill to die on, but I hate fractional scaling with the blazing fury of a thousand suns. To get a 1440p-sized UI on a 27" 4k display, the OS can't just divide by 1.5x; it has to render at 2x and then downscale the result to the panel (a 3:2 ratio) for every frame. OS X does this best as they've had retina displays for a while, but no OS does this well, and it leads to all sorts of performance issues, especially when dealing with view ports. Linux is especially bad.
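To make that concrete, here's a rough sketch (assumed numbers for a 27" 4K panel in a "looks like 2560x1440" mode, not any vendor's actual code) of the render-then-downscale pipeline, next to the clean integer divisors an 8K panel would give you instead:

    # Rough sketch of macOS-style fractional scaling on a 27" 4K panel.
    # Assumed numbers: "looks like 2560x1440" mode on a 3840x2160 panel.
    logical = (2560, 1440)        # UI layout size the user asked for
    panel   = (3840, 2160)        # physical pixels of the display

    # The compositor renders everything at 2x the logical size...
    backing = (logical[0] * 2, logical[1] * 2)   # 5120 x 2880

    # ...then resamples that buffer down to the panel every frame.
    downscale = panel[0] / backing[0]            # 0.75, i.e. a 3:2 ratio
    print(f"render {backing}, downscale by {downscale} to {panel}")

    # Integer scaling on an 8K (7680x4320) panel needs no resample pass:
    for div in (2, 3, 4):
        print(7680 // div, 4320 // div)          # 4K, 1440p, 1080p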
Having said all that, I absolutely will not be using an 8K TV as a display. I'm currently using a 27" 1440p monitor, and while I could probably handle a 32" 8K display, that is the absolute max size I'd tolerate. You start to get into all sorts of issues with viewing distance and angle going larger.
My 27" 1440p is fine for now. I sit far enough away from it that I don't really 'see the pixels' unless I go looking for them. It was also a crazy good deal as it's a 144hz monitor that also has a built in KVM switch that's very useful for WFH.
I wouldn't describe any OS as 'flawless'; they're all doing what I describe under the hood. Qt does have better support than GTK atm. I've also seen bad behavior on Windows, esp with older apps. OS X is about the best out there, but even it can have issues with applications that have a view port (e.g. video editors, etc).
I'd prefer to skip all that so I'm happy staying on 1440p until 8k monitors are where 1440p monitors are today with regard to price and quality.
27” 1440p at 100% is too small for me, so 5K at 200% has the same problem. More generally, the available PPIs combined with integer scaling only yield relatively few options at a given viewing distance. More choice would be nice.
Beware of backlight offsets. TV panels can have smaller backlights, because they're meant to be viewed from further away, and my 46" LG didn't have backlight behind the lower 2-3 rows of pixels and a couple of pixels on the left and right when viewed at my desk. This may not impact some people, but I often go full screen with text, and missing some of the left and bottom pixels was annoying. I ended up configuring i3-gaps so that it never displayed anything in those areas, solving the problem. It worked great as a huge monitor otherwise.
In my experience, when a flat monitor gets too large the edges end up much farther away than the center, and as I glance around, my eyes need to refocus too much. That’s why I vastly prefer curved screens. I currently use a 4K 32in curved monitor by MSI and for me it’s just perfect.
I'd be happy to, but there aren't any 8K TVs at 55" or smaller. I want the pixels, but I'm not going to put a 65" TV on my damn desk -- I have two 27" 4k now, and it's ... fine, I guess? but I want a 42" 8k running at 2x.
If you’re looking for a monitor with high pixel density and a ton of real estate, you can also buy a monitor. 5k2k’s are pretty sweet. I’m driving one of these nowadays and it’s fabulous, without all the quirks of adapting a huge TV for computer use: https://www.dell.com/en-us/shop/dell-ultrasharp-40-curved-th...
HiDPI, two 4k monitors without a bezel, 120Hz, and no need for a separate thunderbolt hub.
I'm not a fan. Large ultra-wide curved screens are fantastic. With large flat screens that are meant to be viewed across the room, you get a distorted image when you sit up close. Your eyes have to focus further away as you look at things closer to the edge of your screen and the viewing angle for that part of the screen is different from the center of the screen. It also requires more effort for your eyes to look up and down rather than left and right. We're hard wired for that horizontal plane. This makes ultrawide screens a really comfortable option.
I almost bought an 8k 55" screen for use as a monitor, but I tested a 55" 4k screen for a week and the flatness is what turned me off to it. I've been using three 32" 4k screens in portrait, arranged in a "curved" config on my desk (2 monitors on each side are mounted at an angle), which I really like. But switching to a large single flat screen was not fun.
For me the holy grail of monitors is a 55" 8k curved screen. Not "ultrawide", I want the full width and height and I want it curved, with full 8k resolution. Maybe someday, but I'm not getting my hopes up too high.
I'm not the guy you asked but I have a similar opinion on flat screens. Personally I'd want spherical. ~15" tall and ~25" wide is about my limit for flat screens; beyond that I find that the corners/edges are too distant/distorted. My home setup is multiple independent 27" screens, which I like. My work setup is a single flat ultrawide (34" probably?), and I find myself physically leaning my head/body from side to side when I have two windows open next to each other. My eye level is a few inches from the top of the screen, and the lowest couple of inches also seem distant/distorted.
This is something I've wanted to do for a while! I wish Samsung still produced their 55" 8K displays-- 8k @ 55" gives you effectively the same PPI as a 27" 4K display. Maybe someday.
That's a hell of a desk. And counter to the argument that "you could just have the one huge screen for entertainment AND work" because this is not a desk you can easily clear out from in front of the sofa when you stop working.
This is making me want to get some blackout curtains for my living room so I can go back to occasionally working with my laptop hooked to the projector, though. It's about the same resolution as my laptop but it's really nice to be focusing on something across the room for a change.
I use a 50" 4K TV as my monitor. It's mounted on a long TV mount that can bend at 3 points, one near the wall, one near the TV and one in the middle. Gives me great freedom. One warning to people who want to do the same: make sure your mount has a way to rotate (around the screen's surface normal) the TV as the weight of it will make it sag.
I've been using 50" 4K/60 TVs (3x actually) as monitors since 2015, and I love them. Before that, from about 2007 on, I used 6x 24" LCDs, and when it came time to upgrade, it didn't make sense to bother with small LCDs and add another row for 12 displays. I found Samsung curved 4K LCDs at the time for around $650 each shipped around Black Friday, so it was a no-brainer. I've never really looked back, and wouldn't consider anything smaller now.
I am wondering how 8K displays would look replacing my current Samsung 4Ks, as these are pre-HDR, but I'll probably use them without complaint until they start dying. Plus no one does curved displays now, which I'll miss from my current TV monitors.
Heh, I do something similar as well, with a 48" LG 4K OLED, which seems popular with other users here. I got this over another 4K or 8K TV because 1) OLED simply looks better and 2) 120 Hz is nice for gaming, but I do want to get the same type of TV with 240 Hz instead for some of the higher-twitch games.
I use Windows and the PowerToys utility, which might arguably be the best window manager I've used, even compared to tiling window managers on Linux, simply because I can specify exactly the layouts I want for every single virtual desktop and every single app.
Overall it works well but for the first little while I did get a headache from sitting too close, but it went away soon after.
At home I use 2 28" 3:2 4K displays, and in the office I use the same setup plus 2 additional 24" WQXGA displays, and I like the ability to spatially arrange windows and corresponding tasks. My mind just doesn't work the same with one huge display. I even noticed this back in the day when multiple displays meant 2 17"-19" 4:3 or 5:4 displays and the first colleagues started to use the first 30" displays with 2560x1600.
I used a 43" 4k TV to replace a multi-monitor setup, and the neck and eye strain was brutal for me. Even with a really nice display with a high refresh rate, viewing the corners from that close up was worse than useless. The brightness was difficult to tune down enough to reduce eye strain and from that close up reducing blue light through software wasn't very helpful.
I've since switched to a 32" 4K curved display (still 16:9, not ultrawide) and have been much happier. The curve makes more of the view useful from the periphery, and the display has some quality-of-life features, like displaying multiple inputs as separate, ratio-configurable "monitors" in hardware. It's also nice to have controls on the display; the TV relied on the remote, and I kept losing track of it.
The only thing I miss is being able to switch to watching sports at the end of the work day, and being able to cast video to it. Those were luxuries duplicated by other things already in the house. I'd like to say I miss gaming on it but I honestly don't, it's much nicer to not have to extend the keyboard and mouse back far enough to also see the entire display at once.
I work mostly with text and code so the curve isn't an issue, and I could see designers preferring a flat panel to avoid distortion. Otherwise I'm not sure I could go back to having such a large display, much less a 65" display.
EDIT: Per another comment, I have mild hyperopia diagnosed about a year into using this setup, which continued for another year after getting glasses to correct it. My prescription has not changed since getting the new display.
I've been using a 34" 1440p curved ultrawide monitor (21:9) since 2020 and it's been amazing. Earlier this year I decided to try using a 42" LG OLED TV as my monitor and lasted about a day before deciding to go back. I 100% agree with you RE: viewing the corners of the flat screen. I'll never go back to a flat monitor/TV for my primary PC again. I think my ideal monitor is ultrawide, curved, 1440p, OLED, and 38" or so.
I've long considered going this way myself, but 8k is tricky for a number of reasons:
- I am very sensitive to glare, and all TVs are glossy
- Smallest size you can get is 55" (up to 50" would be good for me, as I keep my 32-incher on a custom 8" stand — I am pretty tall — so it would simply be a wider screen reaching down to my desk with the top at the same height)
- Connectivity sucks: I am so used to running only my laptop with a single USB-C connection. I had enough with the early Dell MST 24" 4K screen that required 2 DP 1.1 connections IIRC (basically the same thing their 32" 8K has).
- I mostly use Linux (Mac for work though)
So I am waiting for a monitor that can do 8k at 60Hz with an ultraportable that runs Linux and an iGPU that can drive it for productivity (software dev, browsing, video calls — yes, full screen video call is a hog at large resolutions, at least in Linux).
I'll probably sacrifice on the resolution front next (4k at 32" is not enough either) and go with a 4k option at 42-43" people have mentioned elsewhere.
Assuming that those audio speakers are at ear height (I assume they are since those IsoAcoustics stands allow tilting but there is no tilt in the picture), then IMHO the display is placed too high; ideally you want your eye level just below the upper edge of the screen. I don't blame OP though, I just think with this type of screen size it is challenging to achieve that.
I already use a 4K TV for a monitor. 8K would just push a need for a more expensive video card, while decreasing how well people can see when I share my screen. Even on a 4K, I need to blow it up to ridiculous zoom levels to make a screen-share readable to others.
I'm sure not everyone would run into that problem, but it is a fairly strong con to be aware of.
If you are on Linux, you can divide the entire screen into multiple virtual monitors and share only one of them. This has the benefit of giving you "private" monitors that won't be shared.
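For the curious, here's a minimal sketch of one way to do it on X11 using xrandr's --setmonitor feature (the output name "DP-0" and the physical dimensions are assumptions; check yours with "xrandr --listmonitors"). Wayland compositors need their own mechanism, so treat this as X11-only:

    # Split one X11 output into two logical "monitors" so screen-sharing tools
    # that enumerate monitors can pick just one half. Assumes X11, xrandr >= 1.5.
    import subprocess

    OUTPUT = "DP-0"               # hypothetical output name; substitute your own
    W, H = 3840, 2160             # physical resolution of the screen
    MM_W, MM_H = 600, 340         # approximate physical size in mm (used for DPI)

    def setmonitor(name, w, h, x, y, mm_w, mm_h, output):
        geom = f"{w}/{mm_w}x{h}/{mm_h}+{x}+{y}"
        subprocess.run(["xrandr", "--setmonitor", name, geom, output], check=True)

    # The left half owns the real output; the right half is attached to "none",
    # since each physical output may belong to only one monitor entry.
    setmonitor("share-left",  W // 2, H, 0,      0, MM_W // 2, MM_H, OUTPUT)
    setmonitor("share-right", W // 2, H, W // 2, 0, MM_W // 2, MM_H, "none")

    # Undo with: xrandr --delmonitor share-left ; xrandr --delmonitor share-right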
Another option could be to temporarily lower the resolution.
> 8K TVs may be driven at 8K 60 Hz with no chroma subsampling by using HDMI 2.1, which is available on all current (Nvidia RTX 4000 series and AMD 7000 series) and previous gen (Nvidia RTX 3000 series, AMD 6000 series) graphics cards. Older computers with GPUs outputting DisplayPort 1.4 may use adapters such as the Club3D one to achieve 8K 60 Hz.
Isn't "plain" DP 1.4 confined to HBR3 - thus its maximum refresh rate is 8K-30Hz?
I love my Acer Predator 43” 4K. It’s small enough that I don’t feel like I need to extend my desk to sit far enough away, and it also just squeaks under the max load for the Ergotron HX monitor arm.
It’s extremely sharp for normal use, and doubles as a 4K 120Hz monitor for gaming.
I use dual 27" 144 Hz 4K monitors and am mostly pretty happy with my setup, though I have considered moving to an ultrawide curved monitor; I'm just not sure if the OCD side of me would be bothered by the curvature.
Unless I'm misunderstanding, one of the advantages of using physically distinct monitors is that it's easier to send things into a full screen mode without affecting the other displays - I guess apps that support "borderless windows" are less of an issue.
Maybe there's some type of cross platform (Mac, Linux, Windows) virtual display driver software that can allow you to create "picture in picture" virtualized monitors though?
>Unless I'm misunderstanding, one of the advantages of using physically distinct monitors is that it's easier to send things into a full screen mode without affecting the other displays - I guess apps that support "borderless windows" are less of an issue.
This is one of the reasons I stuck with two monitors instead of one long one when I upgraded a while back. I know there are workarounds and helper programs you can install and whatnot, but I like being able to drag something to the side and full screen it without any additional hoops. Plus the long monitor crowd tend to have things centered on the screen and then have small accessory areas to either side instead of two distinctly large screens. Plus resolution wise, unless you're going with a really wide monitor, you probably have more overall resolution with two screens, especially if price is a factor at all. Standalone 27" monitors are basically the standard and are priced accordingly.
My Dell monitor has a picture-by-picture mode which works very well to simulate 2 distinct displays. Each side uses its own video input. Many higher end monitors can do this, unsure how many TVs can.
For me, the best monitors by far for programming are LG's 28in DualUp, due to the aspect ratio. I have a pair side by side, and it's effectively four 1440p screens in a 2x2 layout, giving lots of vertical space without a bezel as well as horizontal space on each screen.
The one issue that I have with using TVs as "monitors" is that they are too damn "smart." They play with the images, and it can be a devil to find all the settings to turn them off. On my Samsung, there's a couple of things that I can't turn off.
I sorta tried this, using a single one of those large 4k curved monitors at my desk in San Francisco before the pandemic. It was alright, but I always liked having two 2k monitors more. At this point, as an Awesome WM user (there are dozens of us!), I really depend on having two different monitors so I can have two different sets of tiling window tags.
> You can even use the same TV for 4K 120 Hz gaming or watching movies as a bonus!
But you can't use the computer at the same time then. With a 3 monitor setup I can add an HDMI switch to one of them, and when I want to play, I can switch that monitor to connect to the PS. This way I still have 2 monitors to use. Then one can be used for TV in the browser and the other one for other stuff.
Am I the only person who wants a monitor that's curved in both axes (left/right and up/down) so I can surround myself with a sphere of monitors, and then pivot on a gimbal?
It's around 100 degrees, while humans can see more like 180 degrees (more if you move your eyes; I don't want to move my eyes, I want to gimbal my body to focus on a specific monitor), although outside the center of your vision you don't have good "resolution". The Vision Pro would be like being inside the sphere, but with a big aperture blocking all the side monitors.
Just got a 32" 4k. I had a 49" 4k in the past, but it broke. My issue with monitors above 49" is it strains the eyes and head looking around. I always had to partition the screen or manually resize, it got annoying. Gonna try 1 4k for landscape and 1 for portrait now.
I wanted to go down this path some months ago, but couldn't find any options on the market. I ended up with a 42" 4k LG C3, but it's just "ok" because I can easily see pixels. I wanted to use the room as dual use work/watch movies, but without the need to watch movies I'd probably go back to a wide screen curved display.
I'm not sure if they ever shipped it to any retail customers. I'm a JVC projector owner so I kinda follow JVC projector news. The higher end JVC PJs are used by Boeing for flight sims:
JVC accommodates that use case with things like extra chassis mounting points to allow the projector to be mounted securely in a dynamic environment. This looks like it may have been an early POC in native 8K for Boeing.
Ultrawide 5K2K is a great sweet spot, at least for what I do, which includes a bit of everything. I never liked dual monitors with a split in the middle. Ultrawides solve that.
I want 4k/5k and an 18"-21" diagonal, but all the hi-dpi smaller screens go to laptops and tablets, I guess. No monitors like that. Hell, under 27" and 4k can be tricky to find these days. 24" models exist but are a shrinking category.
I don't want or need my monitor to take up a huge amount of space. But I do want high pixel density. Looks like I'm in too small a market to serve.
That’s typical of TVs. The signal is delayed by a noticeable amount because, for passive entertainment, why not. Your TV will likely have a mode that does no post-processing and has minimal delay, often called PC or gaming mode. Look up “[your tv model] gaming mode”.
For 1 and 2, I would say it totally boils down to personal preference and distance/size ratio. For 3, again, distance to the screen matters a lot.
The 4th one I've seen the most heated discussions about. In my opinion, the highest resolution you can afford (both money-wise and computational-power-wise) is the most useful. Even if you can't distinguish the individual pixels (aka the screen door effect), aliasing is still an issue.
I'd love to do this but always worried (probably incorrectly) that the energy output wouldn't feel great and result in faster fatigue or require more rest breaks.
My wife asked me how much "huge monitors" cost. I told her 100 bucks on Craigslist. Indeed, we got her an old dumb 1080p LCD and she has been super happy with it. It mostly fills the wall of her little cubby hole in our office.
For my money, I have 2x 1080p 24” displays, and a third curved 32" 1080p display which is hooked to a KVM so I can game on it.
I like the 3 monitor setup because they are all at angles from each other, approximating a huge curved display. Plus, this was a cheap setup off woot.com parts.
I think monitors are like headphones. Unless you actually try the "better" ones, you don't have a clue what you're missing. I know because I had been saying "Dual 1080p 24" is all I will ever need." for a long time until I got a 4K 50". Now I can't imagine going back.
I usually use a pair of Sennheiser HD280s that I've had for over a decade. I've used some fancier headphones costing more than an order of magnitude more, from brands such as ZMF. After experiencing the high-end advantage, I'm still perfectly happy with the 280s. There are a few things I care about in a monitor, and DPI is nowhere on the list. Every monitor commercially available has more resolution than I care about. My number one concern is consistency across a wide viewing angle. Low latency, retina DPI, gamut accuracy, HDR, curved surface? I don't care about any of them. I have tried all of them.
May I ask at what configuration? I'm assuming at least one is vertical because I can't think of a way to set 2 43" monitors horizontally without breaking my neck.
Modern TVs have decent input lag around 10 ms which is on par with professional monitors, but of course it will still be worse than gaming monitors. Lots of people game on their TVs. And most TVs have settings that disable postprocessing.
A couple of years ago, I spent a year or so like this, with the TV resting directly on the desk.
It looked pretty nice, but it had some problems.
- The only actual 8K modes reported over HDMI were some variant of YUV, meaning you could not select what your OS considered an RGB mode
- Even using it at 4K, with the 55" TV a couple of feet from the back of the desk, my eyes could not keep all of it perfectly in focus.
- The power consumption was much higher than a typical ~30" monitor, and the amount of heat created was also significant. This became hard to deal with in summer.
Eventually I gave up on it and returned to a ~30" monitor.
All else being equal, a TV (i.e., TV-sized) unit generally has a broader set of use cases and a longer useful lifecycle than a computer monitor for the original purchaser†, which, it could be argued, makes good economic sense.
† in my experience, computer monitors can have a long useful life when factoring in the potentially long tail of "donor/hand-me-down" cases...
I went through a phase of wanting the most possible screen estate to do sick multi tasking gimmicks like having chats, documentation, code editor, and prototype open at once. It was glorious, a 5k2k ultrawide monitor filled to the brim with a mishmash of sometimes related, sometimes unrelated windows.
Then it hit me that I can only focus on one thing at a time since I’m a human being, and having multiple attention grabbing things in front of me is never good. I now run a single Studio Display and have a code editor in full screen, switching to other content through virtual desktops. I’m WAY more productive this way.
Now I might just have a short attention span and that’s that, but using a TV as a monitor sounds like hell to me now.
I've always wondered why everybody would buy "monitors" for computer use. Isn't it the same thing as a television screen? Back then TVs used to take different inputs but everything is digital now.
That checkerboard effect is certainly interesting. Someone somewhere is going to be nostalgic about this artifact someday, maybe they'll even make a shader to emulate it. I wonder what causes it and why it disappears in game mode.
> on Linux it took about two years for 8K 60 Hz support to work, spawning a salty thread on GitHub
All I see is paying customers asking for support.
> The AMD on Linux fiasco is because the HDMI Forum has prohibited AMD from implementing HDMI 2.1 in their open source Linux drivers.
That's weird since nvidia's open source driver has an implementation.
I'm embarking on a similar geek journey. Just today I bought a used radiology PACS display (barco mdcc-6430) just to see if there is anything novel or cool about the picture or any clinical features. I'm not expecting much but stuff like this is how you find out.
This display is color, however I have considered getting a grayscale only rads display for "ADHD purposes" i.e. the same reason people are interested in e-ink displays (well, one reason).
It will probably be a huge waste of time and money but I'm just a masochist for tech pain I guess...
Nope.. I want to be able to properly split the screen across different inputs, because of the lack of proper window / workspace management if you're not using separate monitors.
I use a 4k TV. I've wanted upgrade to 8k for a while, but according to this post AMD on Linux can't do 8k so I guess I'm sticking with my current setup.
My 780M already struggles running GNOME at 4k, so maybe that's for the best.
I use a single curved 57" 32:9 DUHD monitor (Samsung Odyssey NEO G95NC) for work and gaming. Previously I used 3 24" monitors, but I like this setup a lot more.
I split it into 3 sections (browser for docs, and the rest terminal/nvim), but I can easily change this if I want to show Slack, for example. For gaming I go fullscreen (and use overlays for stuff like VOIP or browsing) because it is a lot more immersive.
I used a 32" non-curved 4k monitor for a few months once. At some point I realized that I was moving my head around a lot as the corners were at an awkward place. On 28" I don't have this.
So anything above 30-ish inches I would consider either curved (expensive for hidpi resolutions) or two/three 27" screens angled a bit.
I can't imagine how bad it would be on a 65" flat screen.
> TLDR: If your job is to write code all day [...], buy an 8K TV instead of a multi-monitor setup.
Counterpoints:
• All my keyboard muscle memory is set up for multi-monitor setups. Theoretically fixable with the right tiling window manager... which I would presumably have to install, since I do too much Windows stuff to go full time Linux. Or perhaps develop. Buying more monitors is a better use of my time.
• I curve my monitors inwards, intentionally, for better viewing angles. Also lets me hide a tower in one of the corners behind the curve on a straighter desk.
• I do too much multi-machine development (e.g. testing refactoring of multi-platform abstractions.) HDMI switches are super convenient; your TV's picture-in-picture functionality... may or may not be. Dual Windows PCs for testing on Nvidia and AMD simultaneously, or remaining unblocked when busy reformatting/reinstalling/compiling/linking/syncing 100GB+ on one? Yes please. It's often interactive enough to want to keep open, yet passive enough to need something else to do. OS X for iOS and Linux for debugging server code? Sure. iOS and Android? Well... those have their own monitors. Consoles don't though, and I've targeted those too.
For an entertainment setup, I can usually scrape by with 2 or 3 monitors (1 landscape for a fullscreen game, others typically portrait for chat/wiki/etc). Right now, I'm on a 75" 4K chonker. I have good eyes, but 8K would be a waste of pixels, and I'm already close enough that the viewing angles are noticeable. Yet I still hauled out a second monitor: an old 2.5K to exile junk I want to monitor off the main screen.
For a development setup, I've bought or brought a 4 x 27" 4K setup if one isn't provided. A 5th monitor has occasionally been useful (1 landscape for console, 4 portrait for console IDE, devtools, devtools IDE, and docs/wiki/jira/chat/notes. Replacing the 4x portrait with 2x 8K landscape... would probably work, at least, although I'm not convinced it'd feel like much of an upgrade, if any.)
I guess. I think the important thing is getting the program in your head, not on the screen. If the code is too complicated to hold it all in your mind then more columns of crisp text will not save you.
I actually kind of agree with this. For me, the more pixels the better (I'm sensitive to fuzzy text, and subpixel rendering makes it worse), but I'd really prefer just one monitor, not too big. 15-19" is fine, especially if it's 4:3. 1600x1200 on a 17" monitor would be really nice.
TL;DR: You can't really replace a monitor wall with a single screen because it does not curve to create the right viewing angle, which makes text seriously unreadable at the edges, which forces you to seriously upscale the font size, which steals the largest amount of real estate possible. Of all the compromises to make, reducing the number of screens is one of the worst ones.
4k screens are already somewhat questionable for productivity for this reason alone. The only serious argument to be had is 1440p vs 1080p (personally I would argue for 1080p, if using bitmap fonts and having perfect eyesight). A 4k monitor wall is a rather fringe setup that only works out to an advantage for day traders and weird surveillance applications, and it requires that you constantly do very energetic body gymnastics to change your perspective's location to be able to see all the details.
With a single 8k screen without upscaling the font size (hence preserving all technical real estate), the body gymnastics required would be so much worse than with a 4k wall that it would be absolutely ridiculous, clownish, and almost impossible to use while typing. Otherwise people mainly want big 4k/8k screens for dual use as a TV set. But this is just wrong in itself; it creates a paradox for no good reason, like using screwdrivers as chisels. Some things are not meant to be. The only arrangement where 4k makes some sense for common use cases is maybe above a curved ultra widescreen.
I normally work with a 40". I'm using Hammerspoon to divide the screen, but normally I end up using one main window, with some smaller windows at the side, and cmd-tabbing between info. How do you manage the distraction of so much information at the same time? Do you switch between apps? Use the mouse? Don't you lose track of where the focused window is?
People and their need for a "leader", no matter the quality. We've had enough "truth tellers" and "follow me men" kinda shills.
Time to realize that not everyone on the internet is your friend. They feed you bullshit all the time and laugh at how gullible people are, questioning nothing and just following based on the perceived merits of an individual.
Huh? One screen for email/Slack/.., main screen for the IDE, other screen for logs, etc. It's a lot less context switching to glance left/right than to go to another virtual desktop.
Author here, ask me anything!
Apart from programming, one of the motivations for getting the 8K display is to look at lidar point clouds. For example the desktop background in my post is a lidar map of Bernal Hill in San Francisco, which I've here downsampled to only 13006 x 7991 px for your convenience [1].
Admittedly, when I bought it at first, I didn't realize there would be so many random issues, as manufacturers all advertised their gear as "8K Ready" even in 2021. As I incrementally fixed the problems, I decided to document my journey in this blog post.
btw I posted this in the past but it got caught by the spam filter and disappeared [2], not sure how to appeal that when it happens. Thanks ingve for posting it again!
[1] https://pics.dllu.net/file/dllu-lidar/tldr_707_all_c_fine_50... (13006 x 7991 px)
[2] https://news.ycombinator.com/item?id=41102135
I am using a 43 inch 4k monitor so I am allin on big screen real estate. But I find that even with a quarter of your screen area, I struggle to read the corners of the screen, the bottom is often obstructed by whatever is lying on my desk, and I had to make the mouse cursor bigger as I kept losing it. I doubt that an even bigger screen would be practical. I do have two 43in monitors side by side but the other one is more like a secondary screen for playing movies or "storing windows", it's too far from the eye to be useful as a primary monitor for reading and writing.
I had the Dell 8k monitor you mentioned, the picture quality was great but it died after a few years not long after the warranty expired (a gut punch at the purchase price) and they said too bad so sad... ok that's fine but I will never buy another Dell product again. It was released too early to have proper displayport support and I had to use a custom nvidia-driver X11 config to make it mostly work as two monitors. And there is basically no way to use that kind of DPI without scaling.
I replaced it with an LG 43UN700 which is a 43" 4K display that I use unscaled and although the LCD panel is vastly inferior I love the thing especially at the price point (under $700). I hope manufacturers continue to support this niche of large singular flat displays because they are fantastic for coding, data viewing/visualization and pitch hit at content consumption as your article states although this one would be no good for gaming. And getting a "monitor" or "professional display" firmware load means a lot less problems than a Smart TV load.
I had a similar experience with Dell after they wanted the price of a new laptop for a replacement laptop battery. This was for the Dell Studio back when battery packs were made to be swappable by simply sliding a latch.
After that phone call to customer support, I made a similar vow to never buy another Dell product. These days, I use a Framework laptop.
You couldn't buy something from eBay or TaoBao/Alibaba?
I did in fact buy a knock-off from ebay battery, but it kept it kept it's charge for hilariously little time. Had to run it of mains power permanently (ran it as a little server for a while).
FWIW, it was the same (even at the enterprise level).
We had a commodity (local cloud) computing Dell infra in the mid 2010s and were constantly replacing/returning “simple stuff” (fans, support flanges, memory, NICs).
“Dude, you’re gettn’ a Dell” became—-nope, never again.
My sitch was around the same time period.
Do you have 20/20 eyesight, and how tall are you?
I use glasses (myopia) and can kind of tolerate the edges of my 32" 4k monitor, but I can't fathom craning my neck all the way up to the edges of a 55"+ display. Not to mention font sizes.
I have fairly bad eyesight with both myopia and astigmatism (-5 sph, -2 cyl) and I wear glasses. I got glasses with 1.71 index lenses, which I greatly prefer over the more common 1.74 index lenses due to the higher Abbe number, resulting in less chromatic aberration.
Anyway, I use browsers at 150% scaling usually, although the text is finer on my terminals. I don't use any scaling for UI elements and terminals. Using the i3 tiling window manager, I put more commonly used terminals on the bottom half of the screen since I find that the top half does require more neck craning.
I'm 184 cm tall.
FWIW there are lenses that are high index while still having a higher Abbe number, but they're expensive and made of pretty specific materials. Interesting that 1.74 is more common where you are; where I am, lower-index polycarbonate is the standard (sadly).
You don't maximize windows except to watch videos at that size. It's more like having multiple monitors with fluid borders. You focus as needed, leaving the rest in your peripheral vision. That said I did miss maximizing windows to focus on tasks.
I had a 55" TV as my main display in 2022. Had it about a foot away from my face. It takes a few days, but your brain and body get used to the size.
I just bought a 39" ultrawide and for the first few days I thought "oh dear, I have to keep turning to see the whole thing," but I've not even thought about it for a couple of weeks now, so I guess I'm acclimated.
YMMV.
I have been using a 32" monitor for the last 10 years. I have found that I am using mostly the center of the monitor. The peripheral edges remain unused.
If I sit far from the monitor, then the FOV could be reduced, but then I have to increase the font size defeating the very purpose of maximizing screen real estate.
This is pretty much what I concluded as well after using my 43" 4K LG monitor for about 3 years. Lately I've been trying out my wife's 27" Apple Studio Display. It's smaller but the PPI is amazing...
You can achieve the same perceptual PPI by sitting further away from your 43" monitor.
The longer focal length is much better for eye health, with less stress on ciliary muscles for focusing.
https://www.tools.rodrigopolo.com/display_calc/
Isn't it good for a little exercise? Maybe we should have 300" monitors so we jog from one edge of the screen to the other as we type code :)
You sit back far enough that the TV encompasses your entire field of view, so at that point there is no need to move your neck at all, only your eyes.
Nice to see other people doing the same thing I do, albeit with a 4k OLED instead. I am waiting for an 8k OLED at an affordable price but it seems I will have to continue waiting.
What brand and model of desk do you have? I have a 48" TV but I sit rather close so it probably takes up the same field of view as your 65".
As to your last paragraph, if you email hn@ycombinator.com and explain the situation, they'll sort you out and sometimes put you into a second chance pool, as it's called.
I have the Uplift 4 leg standing desk [1].
I got the black laminate desktop in a custom 75" x 42" dimension so the whole thing cost me almost $2000.
[1] https://www.upliftdesk.com/uplift-4-leg-standing-desk-v2-v2-...
I wish deep desks were more common! Modern ultrawide curved monitors sit way too close for comfort for me, due to the way their legs have to be angled further back for the center of gravity. Custom desks end up being so expensive.
Any good deep desks that you've found so far?
This is a cheap large desk I like.
https://www.amazon.com/dp/B0BXH2MZRM
I'm using a nice sheet of 4x8 finished plywood from the hardware store. I trimmed the depth down a bit, but not much. Put some edge banding on it, and stick it on top of a Flexispot or whatever other 4-legged desk frame you want to use.
How about the (in)famous Amazon Door Desk? They used to build desks for their offices from a door.
Just don't let Bezos attach the legs .. by all accounts he was a shitty carpenter at best and the wobble factor was high.
Nothing wrong with doors as desk, just get the framing solid.
https://www.aboutamazon.com.au/news/workplace/how-a-door-bec...
Half inch thick solid kiln dried door sized jarrah hardwood is even better .. but that's likely not to hand for most and expensive to order if not.
Go to a used furniture store and pick up an old school executive desk. Those are huge.
A small desk works best imho. Have the bottom of the screen well below desk height.
How did you get it in a custom dimension? I'm almost tempted to just put two of my current desk back to back to make it deeper, would probably be much cheaper than 2k, but then again, they're not standing desks.
Oh I just emailed them, and their rep Jeremy Postma is very nice and responsive.
If you want a cheaper option you could buy an IKEA Karlby Countertop that's 74" x 42" [1] and mount it on the legs yourself.
[1] https://www.ikea.com/us/en/p/karlby-countertop-for-kitchen-i...
What zoom (if any) do you typically run at? For instance, a 200% zoom would give you an effective resolution of 4K, but with much sharper and smoother text and rendered graphics.
I can't believe you're not mentioning the super high power draw of 8k monitors. It's so bad that i'm not even considering getting one.
The power is not so bad, especially compared to the graphics cards you would want to use (and I use my GPU as a toe warmer). The Samsung 8K specifically comes with low power presets which are probably usable in this scenario. Of course, with so many more pixels in 8K than in 4K there is a need for more power, but EU regulation allows selling them if they also support an eco mode.
I am old enough to recall 100W as the typical single light bulb and I still use an electric tea kettle that touches the multi kW range daily.
https://www.tomsguide.com/news/eu-8k-tv-ban-goes-into-effect...
Hey, I asked you on the other thread as well (the iMac one), but this was my question:
—-
Hey, I have a similar setup (https://kayg.org/uses) where I use an LG C1 48" as my primary TV and monitor. I do all work on it; however, I am unable to use tiling window managers as you recommend, because I always struggle to see windows / text placed above my eye level. For that reason, I prefer to use manual window management solutions instead. I am curious how you deal with that problem, one big TV user to another? Or do you not have that problem at all? Thanks!
I'm curious how much heat this thing puts off, and whether there are particular display types that generate more heat than others.
Nothing compares to the old school large dell lcds. I had one years ago that was a furnace.
Yeah it does emit a bit of heat. I think around one or two hundred watts? I haven't measured it directly. I have a mini split air conditioner in my home office.
Makes sense; also see the comment about energy consumption of 8K TVs above.
The comment above has very wrong numbers, by the way; typical consumption for the whole device should be around or less than what that poster claims is drawn just by the CPU!
I recently acquired a 43" 4K monitor for programming - a very boring Philips monitor, used at 100% scale. I hated it at first, but after a month I loved it.
A 2160p actual 'workspace' resolution at this distance (2 feet?) and size (43") seems close to a practical limit for typical use, I thought; even this measly 43" still requires a little occasional head movement to see the top right corner. I noticed a tendency to sit slightly to the left of centre on this monitor, to avoid distortion and maintain clarity with what I'm focusing on (e.g. code/windows, not reference materials). Because of this I suspect at this distance a 43" with a slight curve would be optimal, at least for me.
What I wanted to ask you:
- What is your 'workspace' resolution? Is it something like 6K? I'm guessing your scaling is either 125% or 150%? Your PPI should be around 135, mine 102.
- Are you actually sat perfectly centre? I was wondering this because I keep noticing I tend to gradually shift my keyboard to the left over a day. Maybe this is years of 1440p + side portrait monitor use, I'm not sure, but eventually I accepted that I prefer slightly to the left (odd because my side portrait was on the left...)
- Do you think a curved monitor at this size/distance would improve the ergonomics? I imagine you must get a bit of a neck workout.
After getting this monitor, I'm pretty much sold on single screens again - but I had to switch my window management from keyboard-based tiling shortcuts to 'hold CTRL and move mouse' window management (BetterTouchTool on MacOS), with a tendency to stack up windows messily. I tried custom resize snap zones with BetterSnapTool - but I don't use them. I think that was the biggest challenge to switch from multi monitor to large format. It's a huge benefit to have everything in your context on one screen, but had to rethink how windows get moved around. Now I'm used to it, I want CTRL/SHIFT + mousemove modifiers on every system to deal with windows.
Also related, I bought a 4K tv last weekend for another system to use as a monitor, but found that the gaps between the pixels were unexpectedly large, creating a strange optical effect at close distance, making it unusable (but so close). There might be something different about the screen outer layer (on most TVs?) that polarizes light in a way better suited for distance viewing, but clearly not all TVs have this issue.
I don't know many Linux users doing 4K+ at 144 Hz. I am wondering if you do any screen capture or desktop recording, and if so what software you use and what your experience is like? I cannot reliably capture 4K/144 Hz with my setup, but my desktop environment is still on X11. I tried KDE/Wayland and had a better experience, but ran into other bugs with their integration.
Just curious how your experience with sway has been. I installed it but wasn't expecting to come with no config at all and didn't really want to be bothered setting it up just to test screen recording.
The issue with X11 is that even if you record (using any software), it causes the display refresh rate to artificially drop, and it's a very bad experience overall when you run at 4K 144 Hz. Ultimately the future is Wayland, but I am a little surprised how slow it has been for everyone to integrate it into their software.
Except for gaming, is there any practical use for refresh above 60Hz?
Yes. It makes the experience much better when anything is moving. Hard to convey in words; try it and then go back to 60 Hz to see what you're missing.
Similar to hard drive vs SSD. Before I used a machine with a SSD for the first time hard drives were fine, then my normal was conditioned to that of SSD speeds. Going back to hard drive speeds is painful, just like 60hz even for things like moving windows around the desktop.
No
They are useful for the same reason response rate is important -- motion blur and judder. Things look more crisp and move more fluidly across the screen.
To get a retina-quality display, you need to match the PPI, right? For 5K, 27" is the sweet spot. For 8K, what would be the optimal size of TV?
What you really need to match is the angular resolution in microradians from your eye. You can make any screen smaller by sitting farther back. That said, I do wish my TV was only 42". I guess if you really want the ppi to be exactly the same as a 27" 5K screen, then 27 * 7680 / 5120 = 40.5".
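If you want to play with the numbers yourself, here's a small sketch of both calculations: the same-PPI size quoted above, and the pixels-per-degree figure that actually matters at a given viewing distance (the distances below are illustrative assumptions, not recommendations):

    import math

    def ppi(diag_in, w_px, h_px):
        return math.hypot(w_px, h_px) / diag_in

    def px_per_degree(diag_in, w_px, h_px, distance_in):
        # pixels subtended by one degree of visual angle at this distance
        return ppi(diag_in, w_px, h_px) * distance_in * math.tan(math.radians(1))

    print(27 * 7680 / 5120)                    # 40.5" -> same PPI as a 27" 5K

    # A 65" 8K viewed from ~29" gives roughly the same pixels per degree
    # as a 27" 5K viewed from ~18".
    print(px_per_degree(27, 5120, 2880, 18))   # ~68 px/deg
    print(px_per_degree(65, 7680, 4320, 29))   # ~68 px/deg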
This is exactly the reason I intend to stick with 4k for now: I don't want a display that large. I currently have a 48" 4k display, and I'd prefer to have a 42" or 36" one. (Good choices are hard to find, though, particularly if you actually want 4k rather than ultrawide, want OLED, and don't want to just use a TV.)
I bought the Philips Evnia which fits perfectly into that category at 42". Despite being a gaming monitor it's not garish and I've grown to love the ambilight.
https://www.philips.co.uk/c-p/42M2N8900_00/evnia-gaming-moni...
Do you have access to other lidar areas, like in Pennsylvania/Philly?
Interesting. Time to buy a new TV or monitor for programming. Wondering which resolution and size to go for. I use a 4K 27" for programming and a super-wide for my FS2020.
Btw, I would use two different pairs of glasses: one when I use it as a TV or for playing FS2020/4, and another when I sit close to use it as a programming station.
What do you use the lidar point clouds for?
I was previously working at a lidar company and now I am working at a robotics company providing calibration and localization software to customers using a combination of lidars, cameras, and other sensors.
Nice website.
I know your article is on 8K TVs, but it's worth pointing out that the Dell UP3218K is a 32" 8K monitor (but is also not without its own challenges).
This is already mentioned in the article:
> There is also a Dell UP3218K, but it costs the same as an 8K TV and is much smaller and has many problems. So I do not recommend it unless you really don’t have the desk space. Sitting further back from a bigger screen provides the same field of view as sitting close to a smaller display, and may have less eye strain.
I wish Dell would come out with a refreshed version of the UP3218K that's cheaper and fixes its various little glitches.
I have it and love it. The only problems I have with it are related to it needing to be power cycled if I haven’t used it for a couple days.
You COMPLETELY missed the elephant in the room : 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!) Think 8 cores of the fastest arm 64-bit processors available plus extra hardware accelerators! They need this extra processing power to handle the 8K television load, such as upscaling and color transforms - which never happen when you are using them as a monitor!
So, 8K TVs are a big energy-suck! There's a reason why European regulations banned 100% of 8K TVs until the manufacturers undoubtedly paid for a loophole, and now 8K TVs in Europe are shipped in a super-power-saver mode where they consume just barely below the maximum standard amount of power (90w) ... but nobody leaves them in this mode because they look horrible and dim!
If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...
Anecdotally my house draws 0.4 kW when idle and 0.6-0.7 kW when both my 8K screen and my computer are on. Since my computer draws 0.1-0.2 kW, I surmise that the QN800A doesn't draw 300-400 W total --- maybe 100-200 W.
I run my screen on a brightness setting of 21 (out of 50) which is still quite legible during the day next to a window.
Also, I have solar panels for my house (which is why I'm able to see the total power usage of my house).
RTINGS reviewed the Samsung QN800A as consuming 139W typical, with a 429W maximum: https://www.rtings.com/tv/reviews/samsung/qn800a-8k-qled#tes...
The parent comment is completely wrong on nearly every point it makes. I don't know why it's so upvoted right now.
It doesn't even pass the common sense test. Does anyone really think TVs have 200W CPUs inside just to move pixels around? That's into the territory of a high-end GPU or server CPU. You don't need that much power to move some pixels to a display.
Surely they'd be using image processing ASICs instead of CPUs anyway, hence why they don't draw that kind of power!
I didn't smell anything. A 200W PSU isn't terribly expensive and being cheaper than more efficient processors seems reasonable. I also only run a single 4k monitor so haven't thought about driving 4x the pixels recently.
> I didn't smell anything. A 200W PSU isn't terribly expensive and being cheaper than more efficient processors seems reasonable
200W is the realm of powerful GPUs and enthusiast/server CPUs.
Common sense would rule out an 8K TV requiring as much power as an entire gaming GPU just to move pixels from the HDMI port to the panel.
Maybe these TVs are using salvaged Pentium4 CPUs...
That's a facially absurd statement. Just on the numbers:
The US consumes 500 gigawatts on average, or 5000 watts per household.
So if every household bought an 8K TV, turned it on literally 100% of the time, and didn't reduce their use of their old TV, it would represent a 10% increase in power consumption.
The carbon emissions from residential power generation have approximately halved in the past 20 years. So even with the wildest assumptions, it doesn't "throw away all the progress we've made on Global Warming for the past 20 years ...".
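Spelling that arithmetic out with the comment's own round numbers (and the exaggerated always-on TV figure) gives the same conclusion:

    # Worst-case arithmetic, using the figures quoted above.
    us_avg_power_w  = 500e9                    # ~500 GW average US consumption
    per_household_w = 5000                     # "5000 watts per household"
    households      = us_avg_power_w / per_household_w   # ~100 million

    tv_draw_w = 500                            # the exaggerated always-on 8K TV figure
    print(tv_draw_w / per_household_w)         # 0.10 -> ~10% even in this extreme case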
How does it compare to working from home as opposed to driving to the office?
E.g. let's say I drive 10 miles a day to get to the office vs use an 8k TV at home?
If I go out of my way to work from home, would I be ethically ok to use 8k monitor?
Back of the napkin, it seems like the 8K monitor would be 10x better than driving to the office?
The parent comment is wrong. The Samsung QN800A uses 139W typical, with a peak around 400W according to reviews https://www.rtings.com/tv/reviews/samsung/qn800a-8k-qled#tes...
To put it in perspective, an electric car might need 350 Watt-hours per mile. A 10-mile drive would use 3.5 kWh. That's equivalent to about 24 hours of using that monitor at normal settings, or about 8 hours at maximum brightness.
The comparison doesn't make sense, though, because if you drove to the office you'd still be using a monitor somewhere. A single 4K monitor might take around 30-40W. Using four of them to equal this 8K display would come in right around the 139W typical power consumption of the 8K 65" monitor.
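The same back-of-the-envelope in code, using the rough figures quoted in this thread rather than measurements:

    ev_wh_per_mile = 350
    commute_miles  = 10
    drive_kwh      = ev_wh_per_mile * commute_miles / 1000   # 3.5 kWh per day of driving

    tv_typical_w = 139                         # RTINGS "typical" figure for the QN800A
    hours_equivalent = drive_kwh * 1000 / tv_typical_w
    print(round(hours_equivalent, 1))          # ~25 hours of typical 8K use per skipped commute

    monitor_4k_w = 35                          # midpoint of the 30-40 W estimate above
    print(4 * monitor_4k_w)                    # four 4K monitors ~= 140 W, about the same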
I don't think this is an honest question.
There's no "fixed budget" of energy that is ethically ok to use. The parents point was that these devices are woefully inefficent no matter which way you look at them.
The "best" thing to do would be neither, and is usually to just use the device you have - particularly for low power electronics as the impact of buying a new one is more than the impact of actually running the thing unless you run it 24/7/365
> There's no "fixed budget" of energy that is ethically ok to use.
Not even 0.00001 W? How is it ethical to live in the first place, in that case?
> The parents point was that these devices are woefully inefficent no matter which way you look at them.
It's always a trade off, of productivity, enjoyment vs energy efficiency, isn't it? If I find a setup that allows me to be more productive and enjoy my work more, certainly I would need to balance it with how much potential waste there is in terms of efficiency.
> The "best" thing to do would be neither, and is usually to just use the device you have
That's quite a generic statement. If my device is a budget android phone, do you expect me to keep coding on it, not buying better tools?
FWIW (just to clarify on this one area if I may)
>> There's no "fixed budget" of energy that is ethically ok to use.
> Not even 0.00001 W? How is it ethical to live in the first place in such case?
The idea of no fixed budget is that there is no binary threshold where above is bad and below is okay. It's just a spectrum.
I’d be interested in hearing you reconcile your statement by making an ethical case for the energy use of video games and the hardware that runs them.
Is there a level of energy that is ethically okay to use for video games?
Read up on the 2000-watt society.
> the impact of buying a new one is more than the impact of actually running the thing unless you run it 24/7/365
This is wrong for basically everything we ever use - from a house, to an electric car. And especially for small items like electronics
> You COMPLETELY missed the elephant in the room : 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!)
RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.
Your numbers aren't even close to accurate. 8K TVs do not have 200W CPUs inside. The entire Samsung QN800A uses less power during normal operation than you're claiming the CPU does. You do not need as much power as a mid-range GPU to move pixels from HDMI to a display.
> There's a reason why European regulations banned 100% of 8K TVs
This is also incorrect. European regulations required the default settings, out of the box, to hit a certain energy target.
So large TVs in Europe (8K or otherwise) need to come with their brightness turned down by default. You open the box, set it up, and then turn the brightness to the setting you want.
> until the manufacturers undoubtedly paid for a loophole
This is unfounded conspiracy theory that is also incorrect. Nobody paid for a loophole. The original law was written for out-of-the-box settings. Manufacturers complied with the law. No bribes or conspiracies.
> If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...
The Samsung QN800A 8K TV the author uses, even on high settings, uses incrementally more power than other big screen TVs. The difference is about equal to an old incandescent lightbulb or two. Even if everyone on Earth swapped their TV for a 65" 8K TV tomorrow (lol) it would not set back 20 years of global warming.
This comment is so full of incorrect information and exaggerations that I can't believe it's one of the more upvoted comments here.
> RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.
Can you explain why a TV's power draw fluctuates so much? What does peak load look like for a TV? Does watching the NFL draw more power than playing Factorio?
Most likely brightness. Turn the brightness to the maximum value and power will go up a lot.
Great aspect to consider, thanks for raising it.
This is a bit much.
The average American household uses about 29 kilowatts of power per day (29,000 megawatts).
The difference between using a 4K screen and 8k screen for 8 hours a day is about 100 megawatts difference.
I wouldn't get over excited about something increasing overall energy usage ~2.5%.
> The average American household uses about 29 kilowatts of power per day (29,000 megawatts).
Ignoring the megawatts error that the sibling pointed out, it's 29 kilowatt hours per day. Watts are a unit of power consumption -- joules (energy) per second.
One kilowatt hour is the energy used by running something at 1,000 Watts for one hour.
I think you mean 29,000 W (watts), not megawatts (MW), which are 1000 kilowatts (kW).
To be fair it's not the energy that you're concerned with; it's the source of that energy.
Private jets can't run off nuclear power grids. Also the real problem-child of emissions is not America. China has a billion more people, what are their TVs like?
Good points. I would go further and say it is the integral of emissions over time that we would be most concerned with. From that perspective, over the last 200 years, it is problem children, and rising problem children.
I also recommend this. I've been using a 43-inch 4K TV for the last 10 years. My first TV (Vu 43-inch Iconium) died this year, and I got another 43-inch 4K TV from LG (43UT8050). Ensure you get one that supports at least a 60 Hz refresh rate at 4K; my first one did not. It even starts faster than Android TVs. I always keep it in game mode; this setting ensures minimal input latency and no TV-side post-processing. The smart TV doesn't need to be connected to the internet, since I don't use its smart features. Finding dumb TVs is difficult here.
> The bezels and gaps in between the monitors introduce distractions and one is limited in how one may arrange terminals and windows across multiple displays.
To me, the segmentation is a feature. It lets me offload information density and focus. For example, I commonly have an editor on one screen, a browser on the second, and something like a chat app, terminal, etc on the laptop screen.
Nobody's stopping you from segmenting one big monitor into different regions; and you get to choose how big those regions are from day to day rather than being forced into it.
They tend to be relatively poorly handled by the software, at least out of the box.
Every major modern OS now has some level of tiling/splitting on a monitor's edges baked into its window manager by default. Some can be tweaked to split into smaller subgroups, but that often requires less well-tested/polished options (some apps just ignore the hints), or even third-party extensions.
That's too much extra work. With multiple monitors you can maximize primary apps while still having manual management of smaller supporting apps on another monitor. You also get more edges for rapid snap to the sides of a monitor.
Even Windows has this with PowerToys.
I use 3 monitors but would switch to 1 if games respected the dimensions and 8k displays could refresh at 360Hz.
You're not using the best window managers - most have customizable drag points or keybindings to get exactly what you want.
Since you seem to know about the best window managers, can you recommend one for MacOS which will let me direct focus to whichever window is left/right/down/up of the currently selected one? i3/sway does this just fine, but my impression is that MacOS's api doesn't allow third party developers to pull it off, but I'd love to be wrong about that.
Not the person you were asking, but after years of using i3, AeroSpace is the only way I can use a Mac productively, and does indeed have the feature you're describing.
Wow. Thanks for this. I've bounced off a number of tiling WMs on MacOS over the years - Amethyst, Yabai, others I can't remember - but Aerospace is really excellent. Can't believe I've never heard of it before. Love the custom implementation of spaces as a solution to what ails a number of other tiling WMs. I installed it this morning and disabled the mish-mash of Rectangle Pro, Better Touch Tool and OS kb shortcuts I'd been using.
the article mentions https://github.com/koekeishiya/yabai, which i have been using for a few years to get me to 90% parity with i3.
it has quirks and limitations, some of which can be fixed by disabling system integrity protection but it can definitely handle window tiling and navigating with keybindings when you use the companion daemon https://github.com/koekeishiya/skhd
I use yabai which does what you say and more pretty well. It also lets you completely remove spaces transition effect but this will require disabling of SIP.
Although if that big monitor is an OLED, segmenting it into halves or quarters is kind of begging to end up with a line burned in down or across the middle eventually.
I'm imagining a tiling window manager screensaver that slowly resizes and moves your window boundaries throughout the day
Many OLED monitors and TVs already do this, it's called pixel shifting.
A Panasonic Viera plasma TV I bought around 2012 also had this feature.
Samsung solves this in the TV itself. It can be annoying when the edges of the screen are ever so slightly off, but I'm glad I don't have to worry about it. QCQ90S. I wouldn't recommend it since the TV's GUI is glacially slow, but then again all the ones I tried last year were.
In theory, that would be the solution. In practice I read about all sorts of weird edge cases and bugs when trying to do that.
Some discussions over here:
https://www.reddit.com/r/ultrawidemasterrace/
I am (҂`_´)
Continuously micro-managing the layout of individual application windows in your window manager is a form of procrastination.
Back to work!
Anything fullscreen? How are you segmenting sub displays easily in that scenario?
I have done this in the past using a tiling window manager and it's still better to use different displays. There is something about our monkey brains that makes 'different physical object = do different things' work better than having it all on the same monitor.
I did get it to work for me with thick black bars between the screens, but when you're giving up an inch of screen real estate for every virtual monitor then you might as well get physical ones.
I use a single ultrawide at home and dual-monitors at work.
Initially I thought the one-monitor experience was more seamless, but I do miss the implicit window organization that dual monitors provide. And screen sharing on the ultrawide is a pain.
My Samsung ultra wide has side by side mode with two input cables. Screen sharing (and Windows) thinks it’s two monitors but I can stretch windows all the way across both if I want to since it is an extended set
Best of both worlds, I wish there were a way to configure this within the OS so that you could make a single screen appear like 2, 3, or 4 logical screens.
A decent window management tool (e.g. Rectangle.app) should resolve most of your window management issues - set up many drag points to easily divide windows by half, thirds, quarters, sixths, etc.
Most screen share apps should support sharing by window. Also best for privacy (so your viewers don't see the side channel chat notifications pop up).
Also an ultrawide monitor is preferable for spreadsheet warriors.
I will not give up my 49" 21x9 for anything lesser.
FancyZones does exist to help with some of this if you're on windows:
https://learn.microsoft.com/en-us/windows/powertoys/fancyzon...
If your ultrawide is anything like mine, it also has a setting that lets it register as two separate monitors (PIP/PBP mode), which is like having two monitors without the bezel, but with the convenience of "there's an edge" in the middle of your screen when doing regular desktop work.
Does require two cables of course, but if you're driving an ultrawide, you're probably using a graphics card with three or four outputs anyway.
The Dell Ultrasharp 43 4K monitor [1] has a mode where it pretends to be four monitors, one in each quadrant.
> four unique FHD partitions via Internal Multi-Stream Transport (iMST) when connected to a single PC
That's nice for people to organize their windows without needing to figure out a tiling window manager. If only it was 8K instead of 4K...
[1] https://www.dell.com/en-us/shop/dell-ultrasharp-43-4k-usb-c-...
Same, I've never liked spanning a window across multiple monitors. The discontinuity of the bezel is a handy mental break. Often I'll have email and teams on one screen and my main item of work on the central screen.
Same. This utility is also multiplied by having a separate set of virtual desktops on each display, which lets one create sets of windows/apps that can be mix-matched between screens, reducing the amount of window-shuffling to almost nothing after initial setup.
This is only possible under macOS and Linux, unfortunately. On Windows virtual desktops are still kind of a weird hack that spans one desktop across all monitors.
Yeah that issue seems weird to me, because I've never found bezels themselves to be that much of a problem. Like sure, less bezel is better. But I have some pretty wide gaps in my work monitors, and I've never found it to be a problem.
This article, and a lot of "productivity" articles, feel like spending a lot of time and effort for marginal-at-best improvements. I don't know their specific workflow, but I'm pretty sure they could get basically the same amount of productivity with a handful of 1080p monitors.
After 15 years of having a desk job I find that I’m more sensitive to the position I sit in. My back feels a lot better if I have a single, regular sized screen right in front of me, instead of having additional screen estate on the sides or below (as with a laptop).
At the same time I use virtual desktops that I can switch with both keyboard and mouse.
The general advice is to have top of monitor at eye level, but it's been wrong advice for me personally. I now put the middle of the monitor at eye level. Keeps my head up and posture better. Leaning back instead of stooping.
The general advice provided to me, and relayed by me, is eyes centered at about 2/3 of the screen height. The best posture advice I've received and relayed might surprise you: if you struggle with posture, stop caring about what other people might think of it. Changing or tweaking your posture all the time might look bad, but it also tends to mitigate the effects of being frozen in a bad posture(!) The health impact is too significant to ignore.
Yeah I think the only ergonomic advice I believe anymore is that there does not exist a position that is ergonomic to sustain for more than a couple hours. Humans are not evolved to stay stationary, few mammals are really.
I do this too, though mostly out of necessity. I use a 27" screen a couple feet away. To get the top of the monitor level with my eyes I'd either have to lower it so the bottom of the monitor was almost flush with the desk (which my current monitor's stand won't do anyway), or get a taller chair/lower my desk, both of which would leave my legs rubbing up against the desk underside and my arms at an uncomfortable angle for typing.
Either I have an abnormally short torso, or that advice was written back when most people were using a 14" display.
I switched to a VESA arm so I can put the bottom of the monitor flush with the desk and leaned back at a bit of an angle. It’s fantastic.
Indeed. AIUI your head needs to be back, chin tucked in, which means looking down a bit. If you're looking level or up you're going to be sticking your head out a bit
I'm the same. I use a single 27" 4k monitor and use virtual desktops. The best upgrade for me though was getting a computer prescription for some glasses that I keep on my desk.
Sometimes I think about upgrading to a 5k monitor. The Apple Studio Display looks great, but I'm a Windows user and I'm guessing a lot of the nice features of that display are Mac-only.
There aren't a whole lot of options for 5k monitors. Other than Apple I think there's a Dell, but it's too wide. There's a Samsung but I've been burned by Samsung too many times. There's also an LG 5k monitor but it gets pretty weak reviews.
> The Apple Studio Display looks great, but I'm a Windows user and I'm guessing a lot of the nice features of that display are Mac-only
I can possibly be of some help here. I have a Studio Display, however my work-provided machine is a Dell laptop and so that is what is connected to it most of the time.
Providing your machine can output video via Thunderbolt or USB-C, it will work. That is fairly common these days, though Windows machines capable of driving a 5120x2880 signal can be harder to come across, particularly in the corporate laptop world, though I don't know how much of a concern that is to you.
My last work machine maxed out at 4K which the Studio Display would happily scale up to full screen. I would describe it as substantially sharper than e.g. a 2560x1440 display of equivalent size, but still noticeably less sharp than the full native 5K (obviously). My current machine can do the full 5K, but the performance leaves a lot to be desired (however the thing is a turd anyway, too much corporate security crap bogging it down).
Speakers, camera, and microphone built into the display all work totally fine from Windows. What may be a total non-starter is that you need a Mac or iPad to change the brightness, because there's no physical controls on the display itself and Windows doesn't expose a way to control it. I am lucky/unlucky in that my home office does not get a huge amount of natural light, meaning I've been able to set it to a comfortable brightness from my Mac and then just leave it.
Overall it's a very nice monitor if you can work around the brightness thing. A possibly better contender though is the recent-ish 5K variant of the Asus ProArt[0]. I was using the 1440p version of the same monitor before I got the Studio Display, and I was very happy with it. Good colour reproduction, USB-C Power Delivery for one-cable laptop docking, and a far more adjustable stand than the SD. Worth a look.
[0] https://www.asus.com/displays-desktops/monitors/proart/proar...
I've got the LG 5K and it's been totally dependably kick ass for the 4 years (i think) since I got it (from the Apple Store). Mostly using it on macOS but have used it with Windows and haven't tried with Linux.
Agreed. To each their own, but the obsession with the biggest and/or most possible screens is something that is very hard for me to relate to. As soon as I am regularly craning my neck to see all of my screen real estate, it is no longer a positive in my life. I'm glad these solutions exist for people who enjoy them, but they are definitely not for me.
Same here. I only use and want a single monitor setup. I can alt-tab between windows faster and more comfortably than turning my head to another screen.
Also a dual/multiple setup bothers me for losing the mouse boundaries when it crosses to another screen - I'd rather have the mouse bounded on one screen for faster access to menu bars at the edges.
I used to use dual monitors 50:50 in front of me, but after a few years I started getting neck pain.
Now I put a monitor directly in front of me, and a secondary monitor on the side. No more neck pain.
Same, and wherever I put the second display, it's going to hurt my neck after a very short while.
Same for me. I just tried a curved 27-inch monitor and I hate it.
That's too small for the curvature to provide any benefit.
I find curved beneficial even on 24”, but I’m also quite nearsighted.
I have dual 27 flat monitors at home and dual 27 curved monitors at work and the ones at work are far more comfortable to use.
yeah I'm fine with just an ultra wide, no more stacked monitors for me
Have been sporting a 4K LG CX48 OLED since ~Sept, 2020 best monitor decision ever. I've got two HDMI out cables, 1 going to my gaming rig and the other for my Macbook where I do my work as a developer.
I haven't noticed any burn-in or dead pixels. You need to set it up for success: enable all the burn-in prevention settings the monitor provides (static image darkening, pixel shifting & cleaning). It's also a great idea to do other things such as sleeping the monitor after 1 min of inactivity, no screensaver (or just black), a black desktop background, hiding taskbars, etc.
edit: to add, i have the monitor mounted to the wall and about 1" above the height of my desk[1] - this puts the center of the screen directly at eye level
[1] - https://i.postimg.cc/nhqvM4Yz/62395566614-66-C9-BCAA-367-C-4...
I stole one of these from Best Buy for $500 in march. It’s just so good. I haven’t turned off the local dimming thing with the service remote so that’s still a thing but damn is it such a great monitor. And for gaming cyberpunk at 120hz with hdr melts your face.
crazy how cheap these got, I paid ~$1500 USD in 2020
I upgraded recently, by buying a friends old Samsung Odyssey G9 49" curved monitor off him (he was emigrating). Before that I had 2 x 27" monitors, a setup I had used for ~10 years.
I honestly think the curve is essential when dealing with such a wide display. The alternative would be - as article states - to set it back a little and have a deeper desk so you can actually see the edge of the screen properly. I don't see the point in having a large screen with high pixel density if the edges are not actually easily visible to me without moving my head or body laterally.
The lack of bezels is great though - I'd definitely agree on that front, having 3 web browsers or editors open side by side suits me really well.
It’s different from person to person!, whether the curve is good or not.
I have a ruler flat 55” OLED TV as main monitor. It’s perfect for me. I’m like… 1-1.5 meters from it where I’m closest to it, haha. The edges are further away. It’s fine! – imo / ime.
(The need for the curve is also subtly different depending on how the panel was made. I tried a flat 43” IPS 4K monitor, expecting IPS to be good. And it wasn’t very good. The IPS features in that panel were large enough to affect viewing angle.)
> It’s different from person to person!, whether the curve is good or not.
The amount of curve also varies a lot between models so there's some nuance even within that. The curve might be as strong as 800R or as weak as 2300R depending on the monitor, where the number corresponds to the radius of the circle the panel follows in millimeters.
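For a feel of what those R numbers mean physically, here's a quick sketch of how far the edges of a curved panel sit forward of its center (the panel width is an assumption for illustration; the formula is just circle geometry):

    import math

    # Depth of the curve (sagitta): how much closer the panel edges sit to you
    # than the center, for a given curve radius. Width is assumed, not measured.
    panel_width_mm = 800                      # roughly a 34" 21:9 ultrawide
    for radius_mm in (800, 1000, 1800, 2300):
        half = panel_width_mm / 2
        sagitta = radius_mm - math.sqrt(radius_mm**2 - half**2)
        print(f"{radius_mm}R: edges ~{sagitta:.0f} mm forward of the center")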
I have the 57" version, 7680x2160. It's ... indispensable ... all my Konsoles, app windows, etc. all on one screen with no overlaps.
Got it on a Samsung sale for ~$1500 IIRC, one of the best upgrades I'd ever done.
Same, though I'm also on 49" (5120x1440). They're selling them for extra cheap on Amazon with extended (36mo) warranties because they're prone to breaking, but I had the Samsung contractors out here this month and they did a great job fixing mine that randomly died one day -- for free! If you're a chill soul, I'd say it's worth the risk.
I sound like a shill, so Samsung plz hmu. $999 for a beautiful OLED monitor that fits a terminal, a browser, and 4 (font size 8...) 100col text editor windows is a gamechanger.
As weird as the aspect ratio can be on a curved ultrawide, I think it's also more natural and ergonomic to keep your head/eyes at a constant height and just move them side to side. With a monitor that has a lot of verticality you're gonna have to tilt your neck back more.
Why are these monitors sold as "gaming" monitors?
Low response time (i.e. time it takes for a pixel to change color) to reduce ghosting, and a high refresh rate up to 240 Hz.
These monitors are expensive and do not have very high resolution. If you're not a hardcore fast reflex gamer, and you spend a lot of time looking at text, then IMO it's better to buy a higher resolution monitor for less money.
These days you can buy 4K/240Hz displays that have a 1080p/480Hz mode.
Even 240Hz is usually enough for really good players. 480Hz is just for the 0.01% who can take advantage of it.
https://www.amazon.com/ASUS-Swift-Gaming-Monitor-PG32UCDP/dp...
https://www.amazon.com/LG-32GS95UE-Ultragear-DisplayHDR-Dis...
https://www.amazon.com/LG-32GS95UV-Ultragear-DisplayHDR-Dis...
https://www.amazon.com/Predator-Monitor-FreeSync-Premium-100...
> 480Hz is just for the 0.01% who can take advantage of it.
I'm skeptical that any human can take advantage of that. Even 240Hz is stretching it.
I have a 144hz monitor and a 240hz. I can definitely see the difference.
I think at that point it’s not really conscious any more? It always takes me a little while to realize my monitor somehow went to 30hz, and that’s why I’m feeling something is off.
Note that these are all 32" panels so the PPI is on the lower side of 4K monitors.
If you want pixel density first and speed second then you should go for a 27" 4K instead.
Or just sit a bit further, you will get the same exact FoV.
4K gaming monitors do provide a reasonable middle-ground between "extremely fast but only 100-110ppi" and "extremely high res but only 60hz" now though. You can get 163ppi at 144hz without breaking the bank, which isn't quite retina by Apple's definition, but it's good enough for me considering the benefit of high refresh rate.
I'm guessing because it allows you to set the field of view to be pretty wide?
I mostly play simulation games, particularly flying, and having a wider FoV makes things easier, until you're ready to go to the top step of using VR instead so you also get depth perception and essentially 360 FoV since you can rotate your head.
A curved, very wide fov screws up the camera projection for most games though.
I wonder what the math would look like to properly render 3D scenes onto a curved display. Could it be accelerated as well as the regular matrix operations used for perspective projection onto planar screens?
During the pandemic I did try out my 4K TV as a game monitor. I had a combination of furniture so that I could sit rather close with my eyes approximately half way up the screen, with a keyboard and mouse in a reasonable position. Then, using an older FPS game I got it to where my laptop GPU could hit good frame rates and I adjusted the game's viewing angle to match how the screen fit my field of view.
It was deeply immersive in spite of me being so close I could "see the pixels". The only time I've felt more immersive was demoing Quake in a 3 wall + floor CAVE at a national lab decades ago.
> I wonder what the math would look like to properly render 3D scenes onto a curved display. Could it be accelerated as well as the regular matrix operations used for perspective projection onto planar screens?
The math is pretty simple to account for a curved viewport, even though I don't think any apps actually care about that. Most displays aren't curved enough to make it a meaningful difference.
We don't have fixed function pipelines anymore either so that could definitely be handled by hardware.
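A minimal sketch of the geometry being discussed, not how any particular engine does it: on a flat screen, equal steps across the screen map onto an image plane, while on a cylindrical screen they map onto equal steps of angle. The divergence in between is the distortion a strong curve introduces when a game uses a plain planar projection.

    import math

    # Primary-ray angle (degrees) for a horizontal screen coordinate x_ndc in [-1, 1],
    # 2D top-down view with the camera at the center of the screen.
    def planar_angle(x_ndc, h_fov_deg):
        half = math.radians(h_fov_deg) / 2
        return math.degrees(math.atan(x_ndc * math.tan(half)))

    def cylindrical_angle(x_ndc, h_fov_deg):
        return x_ndc * h_fov_deg / 2          # pixels equally spaced in angle

    # The two agree at the center and at the edges but diverge in between.
    for x in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(x, round(planar_angle(x, 120), 1), round(cylindrical_angle(x, 120), 1))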
This used to be much more true, but almost all PC games support 21:9 now and 32:9 support pretty common too. "most games" screwed up is an exaggeration IMO. Even on games that don't officially scale, on PC they almost always have customizable FoV that gets the perspective correct again. Many modern games are even smart enough to rearrange the UI so that the critical info (health bars, ammo counts etc) is in the center of the display and not attached to the edges.
PC games have kinda been forced to support ultrawides whether they like it or not - the 21:9 class especially has exploded in popularity for gaming PCs.
I've gamed in 32:9 for years now - I wouldn't go back. The curve is not exaggerated enough to be a meaningful projection issue on most curved displays and games.
It's the curve that messes things up. It's just significantly more incorrect on wider displays. Many monitors are 1800R, and that's easily curved enough for the projection error to be quite pronounced at 32:9 using a planar projection.
32" Odyssey G7 is the pick for me, I wouldn't mind an upgrade to the 4k version, but the 1440p version is more than good enough.
I also don't see the point in having a screen so big I have to move my head, or contrarily a screen so big that I have to push it back so the pixel density matters much less.
According to https://tools.rodrigopolo.com/display_calc/, a 65" 8K like the one in the article is retina at a 26" viewing distance (136 PPI). For reference, a 27" 4K screen has 163 PPI, and is retina at 21" by the same math. A 27" 5K (like the Apple Studio Display) has 218 PPI and is retina at 16".
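Those figures follow from straightforward geometry; here's a rough sketch of the math such calculators use, assuming the usual "one pixel per arcminute of 20/20 acuity" definition of retina (the exact cutoff varies a little between calculators, so it lands within an inch or so of the numbers above):

    import math

    def ppi(diag_in, w_px, h_px):
        return math.hypot(w_px, h_px) / diag_in

    def retina_distance_in(ppi_value):
        one_arcmin = math.radians(1 / 60)                  # ~0.00029 rad
        return (1 / ppi_value) / math.tan(one_arcmin)      # pixel pitch / angle

    for name, diag, w, h in [("65in 8K", 65, 7680, 4320),
                             ("27in 4K", 27, 3840, 2160),
                             ("27in 5K", 27, 5120, 2880)]:
        p = ppi(diag, w, h)
        print(f"{name}: {p:.0f} PPI, retina beyond ~{retina_distance_in(p):.0f} in")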
The DPI of this screen is too low for all the drawbacks. Would rather have crisper text (150+ DPI, 200 preferable) and/or be able to carry it myself. Needs to be about 42" for that.
The biggest problem I see is ergonomics.
The proper monitor height is when the top third of the screen is at or slightly below your eye level when seated or standing upright. This positioning helps prevent neck strain and allows for a comfortable viewing angle.
The top third of a large TV will be much higher than that, which will cause long term discomfort.
That's why large monitors have much wider aspect than TVs.
Yep a huge monitor sounds good in theory but you end up with neck and eye strain from panning your head constantly unless you place it so far away that it’s effectively a regular monitor at a regular distance.
Would recommend a black background Vscode theme for an OLED. The black background with red accents looks beautiful, at least on my smaller XPS 15 4k OLED. I use Dobri Next Black with some customizations but it looks good by default as well.
Nice, I use Hyper Term Theme. I'll have to check out the one you mentioned.
I got my 8k 55" tv for under 1000 usd several years ago. Brand new, from a brick and mortar electronics store. So it is definitely possible to make 8k monitors for less than 1000 usd.
A mere 55" with 8K resolution makes no sense as a TV, but it's glorious as a productivity monitor. But instead of becoming commonplace as monitors, the panels seem to just be disappearing even as TV's. At the moment I can't find anything at any price that can replace my current setup.
The market isn't working for monitors. Everything available now is either crap, or costs 10x more than it clearly could. Millions of people are spending years of their lives in front of bad screens because monitor makers don't want to make good ones.
I feel like Apple's 30 inch 6k display would be the sweet spot for me, but it's 60 Hz and costs what, $6,000? I just use 27" 4k monitors for work. It's fine but I'd definitely like something a bit bigger and even crisper. I have to use Windows for work though.
I love this blog post.
“It can display seven equally spaced vertical columns of text (critical importance), has driver issues (minimal importance), wake issues (who cares), it costs as much as four smaller monitors (this is good), I need a huge desk (hell yeah), there are multiple image quality issues (well it’s not like I have to look at it all day)…”
It is like “I spent fifteen hundred dollars on a multitude of hassles due to purchasing the wrong type of display, but due to the lack of bezel this is a prime efficiency move “
I chuckled at "The 8K display is only $1500 at BestBuy!" That "only", lol. I spent $400 on my projector that I use for my main screen and it works great. But when I did that I had previously only bought $200 projectors, so even that was not an "only" for me.
I've never spent more than $75 on a monitor. I only buy used. Monitors depreciate like crazy and businesses are constantly getting rid of them, even when they're only a few years old. Yeah, you aren't going to get some 9001Hz 10K giga-OLED whatever, but I'm a programmer. If it displays text with reasonable contrast without hogging my whole desk, it does everything I need it to do.
The most expensive one - the $75 one - is a 24" 1920x1200 IPS display with HDMI, DP, VGA, 2x DVI, S-Video, and YPbPr composite. Never seen those last two on a monitor before, but there they are. I don't use that display as my main one anymore, but I keep it around because it's awesome and it plugs into literally anything.
Reality warps when you and everyone you know pulls $200k+ annually.
May I ask what projector? I’m thinking about getting one as well
It's an 8k projector?
I remember dropping a grand on a 30-inch 2560x1600 back in the day and thinking that was the ultimate.
The 40 to 45 inch range is the ideal; otherwise screen real estate goes too far into peripheral vision.
The other issue with a lot of really big screen real estate is managing lots of windows. With dual screens you can usually maximize applications more easily than with one, because when you maximize on a super big screen it just takes up everything.
And it pushes what is usually the most relevant stuff, in the upper left hand corner, way out to the far upper left, which is actually pretty far out of your main field of vision.
But I still love the 43-in 4K TV I've been using since 2010 or so
Is it a dumb projector?
> Multiple image quality issues
Only the first one (dirty screen) is a real issue, but it is subtle and irrelevant to programming; the second one (checkerboard), as the post explains, is solved by toggling an option in settings.
> Driver issues
The post explains that it works perfectly with current NVidia drivers on Linux, and on Windows both AMD and NVidia have had driver support for HDMI 2.1 for years.
Once you no longer see pixels you'll never want to go back.
I'd give a lot to go back to my 20 year old eyes that could see pixels without special glasses. Sure I can't see pixels (well maybe I still could on a janky third-party CGA monitor from 1983), but it isn't worth it. (I'd say save your eyesight, but realistically I'm not aware of anything you can do to keep it past about 45.)
I've used both. I quite honestly don't care. I've heard many people that share your sentiment. But some of us just don't. Visible pixels are totally fine for me.
Don't you long for the warm glow of CRT phosphors??
I went back from using different displays in HiDPI to using a single 43” 4K screen set to 100 % scaling. Screen estate trumps invisible pixels [for me, at the moment].
I think you'd have to sit further back than is otherwise natural (and then have the issue of legibility/lost workspace) to achieve "can't see the pixels" on this.
Sure it's 8K, but it's 65", so it's only got a PPI of 135. For comparison, Apple (computer) displays and a handful of third parties that target Mac use are generally 200-220 PPI. That is "can't see the pixels" density, even if you smash your face against it.
220 ppi output with no subpixel rendering (ie modern Macs) has clearly visible jagged edges in angled lines and letters if you've got good vision or correct your vision to better than 20/20 (my case: I get headaches if I don't).
If you are coming from typesetting world, laser printers from the early 1990s did 600dpi (dots per inch), and that remains sufficient for smooth lines, though newer printers will do 1200dpi too. Going down to 300dpi printouts is crap.
Heck, newer Kindles do 300ppi and that can clearly be improved.
Apple's "retina", like all things in life, does work for 90% of the human population as advertised, but there's still a big number of people who have better angular resolution than what they target.
I have a 55" 8K and I can't see the pixels while sitting 2ft away. Everything is crisp and I have a huge workspace. For mac I use 4k native so 2x integer scaling.
I didn't see any mention of how many times he has to pick up his mouse when it gets to the edge of the pad to get the mouse from one edge of the screen to the other.
Author here: I use a Logitech G Pro X Superlight but also I use the i3 window manager and rely on keyboard shortcuts for a lot of the navigation. I have the mouse sensitivity set so that the cursor can traverse the width of the screen when moving the mouse about 13 cm, without any acceleration. This is still precise enough that I can move the mouse pixel by pixel if needed.
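For anyone curious, the sensitivity implied by those numbers works out roughly as follows (a back-of-the-envelope sketch using the figures from the comment above, assuming no OS-level multiplier):

    # Mouse resolution (counts per inch) needed to cross a 7680-pixel-wide screen
    # in ~13 cm of mouse travel, with acceleration off and a 1:1 OS multiplier.
    screen_width_px = 7680
    travel_in = 13 / 2.54                              # 13 cm in inches
    print(f"~{screen_width_px / travel_in:.0f} CPI")   # about 1500 CPI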
That's easily solved with mouse sensitivity settings, it doesn't matter the size of the screen if you set it properly.
With all that text, I'm hoping their religion is keyboard shortcuts.
i3/sway was recommended in the post, so yes.
Maybe they use a marble mouse or something like that.
I find it annoying that they've kind of gotten rid of mouse trails and other easy ways of finding the mouse pointer.
That's one of the main drawbacks of a massive screen: if you lose the pointer it takes a lot longer to find it. And it doesn't scale linearly with the width of the monitor; it scales with the square.
So a 50-inch monitor is going to take roughly three times longer to find the mouse pointer on than a 30-inch one.
I don't like those hotkeys where it highlights the pointer. I like the mouse trail; that's the one where I can most easily find it. But generally those went out of fashion about 15 years ago.
Pointer trails are still a feature in windows last I checked, and hitting ctrl to animate a circle around it works pretty much everywhere. I don't use either of these features nowadays, and usually find my cursor by moving it until I get to somewhere with high contrast.
I haven't seen anything I like quite as much for quickly finding the cursor as macos's "wiggle for giant cursor" feature.
Who needs a pad? And who needs the mouse to move more than 2 to 3 cm max, with dynamic acceleration? I'm just doing 0.5 cm twitches most of the time.
The example he's chosen is of a ridiculously sized TV. 65" is living room TV size.
There are smaller OLED displays that would be more suitable (while still rather big). Many are 'just' 4k, but the smaller sizes should give one a decent pixel size.
I actually spent $3500 on mine haha, back in 2021. Early adopter tax...
yeah but I kind of get it..
Absolutely. I remember watching Swordfish and wanting Stanley’s setup
I've been using 42" 4K TVs as my monitor for like 10 years now. 2 years ago I upgraded to an OLED LG A1 and it has been amazing.
https://www.rtings.com/tv/reviews/lg/a1-oled
For anyone using a TV I recommend using https://github.com/waydabber/BetterDisplay to properly scale the display.
I have a 43" LG 4K TV as my main screen for the last two years, it's great. I'd actually like something just a little bit bigger, 50" maybe?
The trick for me was to wall-mount it and get a deep desk. I prefer to be at least 36" away from it.
Have you had issues with image retention? I also like the 43” 4K setup for some things, but these days it seems IPS screens in that size are not as easy to find, I’ve always been wary of OLED due to burn-in
Looks great, thank you for sharing. I've been looking for something similar (OLED, for work/gaming) so looked into the one you're using.
> Doesn't support variable refresh rates or HDMI 2.1.
Unfortunately that makes it a deal breaker. The search continues
The C1/C2/C3/C4 do support HDMI 2.1 and VRR.
I love my 65" LG GX OLED as daily driver for work and gaming. See https://www.theshepreport.com/p/the-shep-report-holiday-tech... I came from an Obutto Revolution cockpit setup with 4 monitors; the ergonomics were awful.
With the LG I'm about a meter or less away from screen and use window management tools to pull focus to the center lower section for any focused work. I run Win 11 from an RTX3080 card with a 2.1 HDMI cable. 3840x2160 120Hz.
For gaming I just use windowed mode and use the full width of the 65" but just the lower half usually for COD or FPS games. I don't notice any eye strain or other issues but do run everything I can in dark mode including using the browser with the Dark Reader extension.
Direct Link with photos https://www.theshepreport.com/i/139215541/lg-gx-oled-monitor...
Do you foresee any AR/VR headset/glasses replacing your monitor within the next year?
I’m doing something like this in my current home setup, but the thing I miss most about multi-monitor is screen sharing on Zoom.
I used to be able to just share one entire monitor and could drag windows I wanted to make visible to that display. Now I tend to share single applications, and have to unshare and reshare to change the view.
First world problems and all, but it would be nice if Zoom let you partition off a part of a display (instead of all or nothing). Would love to draw a bounding box of "share everything in this box."
I don’t think this annoyance is enough to make me go back, but there are times when I’ve considered it.
Deskpad might be what you’re after! It’s a virtual display in a window, you can share that instead of your whole screen but still get multi-app flows captured
https://github.com/Stengo/DeskPad
If you're on Linux, xrandr can probably partition your monitor into an arbitrary number of displays (may not work with Wayland)
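A rough sketch of what that looks like with xrandr's --setmonitor option on X11 (the output name DP-1 and the physical-size fields in millimeters are placeholders; check `xrandr --listmonitors` for your setup):

    import subprocess

    # Split one 3840x2160 output into two side-by-side virtual monitors that
    # screen-sharing tools can then pick individually. X11 only.
    def setmonitor(name, geometry, output):
        subprocess.run(["xrandr", "--setmonitor", name, geometry, output], check=True)

    setmonitor("left",  "1920/300x2160/340+0+0",    "DP-1")   # left half
    setmonitor("right", "1920/300x2160/340+1920+0", "none")   # right half; output already claimed
    # Undo later with: xrandr --delmonitor left (and likewise for "right")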
Zoom supports sharing only a part of the screen if you open the Advanced tab in the screen sharing dialogue.
I can do that in Zoom on a Mac: you select a box and it shares that specific box.
I use a 48" OLED with my MacBook Air M3 and for me that is a near ultimate web development experience both on desktop and when travelling:
https://bsky.app/profile/benhouston3d.bsky.social/post/3l7li...
I mentioned this here on Hacker News just yesterday, but most respondents were appalled that it was only a 4K monitor: https://news.ycombinator.com/item?id=41988340
48 inches at 4K is such a low pixel density, are you not bothered by how bad text looks compared to high DPI screens?
You might have posted a wrong link for the second link, which is the same as this HN post.
Thx. Fixed it!
Same. I used to have a 56" OLED but it was a tad large. 48 is perfect. The hardest part was buying a good desk mount.
It doesn't work great with an all white screen but I use dark mode for most things
The list of issues / caveats seems pretty significant compared to "I have a small bezel between my screens".
From experience with a 55" 4K OLED as main monitor, I can attest that the length of the caveat list is not indicative of the total impact of the caveats. It's more an indication of a thoughtful and thorough person writing the list.
I am looking for a 55” 4K OLED. Do you have a recommendation? And are there any technical caveats with it? (I use a Mac primarily). Thank you
I went with the LG CX model based on what I read on rtings.com
That’s a previous-generation model. I think all of the LG TVs are good.
There are / were technical caveats. I believe all of them are solved by M3 macs that have HDMI 2.1 ports. (M3 or M3 Pro or something? The ones advertised as 8K capable.) Out of the box, those will do 4K 120Hz HDR with variable refresh rate and full 444 color. This is what you want.
It is possible to get that going on older machines, except for VRR which is more of a nice-to-have anyway.
I have a 2018 Macbook Pro 15”. Disclaimer!: My setup was a “complexity pet”, a tinkering project; There are simpler ways to connect a 120Hz 4K HDR HDMI 2.1 display to a non-HDMI-2-1 mac. And! My tinkering project wasn’t only about getting the display working correctly. It was more about messing with eGPUs and virtualization and stuff. Definitely a long way round.
On my Intel mac, I use an AMD Radeon 6800 XT eGPU with Club3D or CableMatters DisplayPort-to-HDMI 2.1 adapters. Plus some EDID hacking which is easy to do.
EDID is how the display identifies itself to the OS. The EDID payload can be overridden on the OS side. Mostly it’s about copying the display’s EDID and deleting the entry that says the display can accept 4:2:0 color. Only then does macOS switch to 4:4:4 color. I also created a custom “modeline” with tighter timing to get 120Hz going fully.
—Please be assured that this was way more complex than it needed to be. It was for fun!
There are much easier ways to do this. Lots of forum posts on it. On the MacRumors forums iirc? User joevt is The Man.
And even then, what I wrote above is actually easy to do once you know it’s possible.
Mostly though you really want an M3 Mac that just has HDMI 2.1 and is ready to go.
There are/were also OLED gaming monitors available, such as from Alienware. Those have DisplayPort inputs and are ready to go with almost any older Mac. Might be able to find one for a price equivalent to a TV, idk.
The issue with the text rendering would frustrate me a lot.
And if the solution is to sit further away, why not just get a smaller screen and sit closer?
I believe the discussion about text rendering is referring only to a line of very cheap TVs that do not in fact have RGB pixels. They have half RG and half GB. For "normal" video content, this is a surprisingly small quality drop. For high-contrast text it's total murder. You can see the stippling pattern as clear as day and it can easily render 8-10pt text literally illegible.
IT once accidentally bought such a TV and had it in a conference room. Took us a while to convince the relevant people that, yes, it is nominally working fine, it's not "broken" in the sense that it doesn't turn on or half the screen won't light up, but it was intolerable for Zoom screen shares.
But you need to be scraping the bottom of the barrel to end up with those screens. I doubt you could find something labelled a "monitor" that has that, and, well, if you're putting a $150 40" TV on to your computer... I mean... what did you expect?
(There are also low-end TVs that are still using some crappy LCD techs with bad viewing angles that may make them difficult to use up close, but I wouldn't call that a text rendering problem... those issues just wreck everything. I once had a laptop that when used on a lap, had zero viewing angles; if the vertical middle of the screen was correct, the top and bottom was extremely visibly color shifted. Even the cheapest store brand TVs don't seem to be that bad anymore, though.)
> I believe the discussion about text rendering is referring only to a line of very cheap TVs that do not in fact have RGB pixels.
It also comes up with very expensive OLED monitors, which do usually have true RGB or WRGB pixels, but their subpixels are usually not arranged in the standard horizontal RGB stripe which breaks most implementations of subpixel font rendering. With a sufficiently high pixel density it doesn't matter, but with the ~108ppi of a 27" 1440p OLED monitor the text rendering can be quite visibly worse than a 27" 1440p LCD.
What's wrong with text rendering?
> TVs may have a different subpixel layout than monitors, so small text may suffer fringing. As of writing the Samsung VA and LG IPS panels such as the QN800A have a conventional RGB or BGR subpixel structure. One may also increase the font size or use hidpi scaling which will eliminate all pixel-level concerns.
There seem to be very few options for smaller HiDPI 8K displays. I only know of the Dell UltraSharp, and it costs way more than 8K TVs.
The Dell hasn't been updated for DP 2.0.
Back in my days at Boeing, I had a full size drafting table in addition to the usual desk. I've always wanted a display that big. In fact, I want my entire desk surface to be such a display!
The 8k monitors are progress!
This clearly calls for dual 8k displays! One as the desk (under a solid layer of glass), and one as the monitor.
The under glass display should have Wacom and touch support too.
I like the cut of your jib!
That's simply too big a screen to be sitting right in front of.
I do agree on the basic idea of not running two monitors tho. I used to, and I got neck pains eventually.
My current setup is a single 32" curved QHD monitor and I wouldn't change it for the world. It's just the right size so you can see the whole screen at once, yet large enough to run 3 browsers side by side.
Also, I suggest people learn about virtual desktops rather than wasting money on bizarrely huge screens or multi-monitor setups.
> Also, I suggest people learn about virtual desktops rather than wasting money on bizarrely huge screens or multi-monitor setups.
This sounds a little condescending especially when the author is clearly a technically savvy user who uses a tiling window manager.
> Also, I suggest people learn about virtual desktops rather than wasting money on bizarrely huge screens or multi-monitor setups.
If I want to have multiple things open and be able to glance at them at once, how would virtual desktops help with that?
If you have it set up right you can flip to the other desktop quickly, see what you want, and flip back fast. I haven't seen a good virtual desktop implementation since around 1998 though, and have given up.
55" is not too big. Maybe it's too big for you, but I've been using three 32" 4k screens in portrait for many years, combined they are essentially about the size of a 55" screen. I love it and anything less kind of sucks. No, virtual desktops are no substitute for having more screen size. I use virtual desktops on my massive screen(s) and I love that too.
Sounds like a recipe for neck pain.
The 3 32” screens are probably angled around you and the total aspect ratio is extreme widescreen (side to side panning, not vertical neck up down panning). The 3 screens are likely much much better ergonomically.
55" was fine but I'm happy I downsized to 48"
As a developer who routinely experiments with large OLED panels for programming, a new big monitor has to be:
* Curved (flat is eye strain above ~40 inches)
* 8k.
* Flicker free / PWM free.
* Not glossy.
Otherwise, I'm in!
Ready to retire my fleet of 2560 x 1440 vertical mounts.
Note: Combined these effectively form a curved 5760 x 2560 monitor with vertical bars, heh.
Just looked and I'm not seeing ANY 55" 8k around right now. All the ones I'm seeing now are 65" at the smallest, and that seems way too big.
I've found a 43" 4k is about the right pixel size for me; a 49" 4k was too big on a desk. Fortunately, it died and I could get a smaller replacement.
I suspect a 49/50" 8k would be ideal, but I'm not going to hold my breath.
>The AMD on Linux fiasco is because the HDMI Forum has prohibited AMD from implementing HDMI 2.1 in their open source Linux drivers.
I wonder why this didn't stop nvidia.
https://www.reddit.com/r/linux_gaming/comments/uoxtsx/the_nv...
I used a tv as a monitor for a while and it was great -- but there is one problem with single monitor setups -- screen sharing/recording. If the app you're using lets you select a portion of the screen to share, that's great. But something like Slack you either share an app window, or the entire screen. This is very annoying in a single monitor setup. It would be amazing if you could select a part of your screen and tell the OS "treat this area like a separate monitor".
There's a number of tools on MacOS and Windows that let you do this.
Examples:
Windows: https://github.com/tom-englert/RegionToShare
Mac: https://github.com/Stengo/DeskPad
Oh nice, I was just looking through for WIN32 apis to make something like RegionToShare. This saves me some time!
I have used one of the original 4k TVs-as-a-monitor ( https://www.avsforum.com/threads/review-of-the-seiki-39-4k-d... ) as my central monitor (plus one on each side) for 10+ years now. Not feeling any need to upgrade (don't do graphics/games, just lots and lots of text terminals and browser windows)
On most monitors I've been using these days, I keep scaling the resolution down. I've noticed that the bigger the text, the more comfortable my eyes feel. I still prefer a good high-res monitor because it scales down with less blur
This is what I do too, then I can sit even further away. Feels good on the ole' eyeballs.
I am excited for 8k monitors in the future, because they give you a lot more options for integer scaling than current 4k displays.
I know this is a nerdish hill to die on, but I hate fractional scaling with the blazing fury of a thousand suns. To get a 1440p-sized UI on a 27" 4K display you can't just draw at 1.5x; the OS has to render everything at a higher resolution and downscale it every frame. OS X does this best as they've had retina displays for a while, but no OS does this well, and it leads to all sorts of performance issues, especially when dealing with viewports. Linux is especially bad.
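For contrast, integer scaling avoids that render-and-downscale step entirely, which is the appeal of 8K; a quick sketch of the "looks like" workspaces each panel offers at clean integer factors (not all of them useful sizes):

    # "Looks like" workspace sizes at clean integer scale factors.
    panels = {"4K (3840x2160)": (3840, 2160), "8K (7680x4320)": (7680, 4320)}
    for name, (w, h) in panels.items():
        opts = [f"{w // s}x{h // s} @ {s}x" for s in (1, 2, 3, 4)
                if w % s == 0 and h % s == 0]
        print(name, "->", ", ".join(opts))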
Having said all that, I absolutely will not be using an 8k tv as a display. I'm currently using a 27" 1440p monitor, and while I could probably handle a 32" 8k display that is the absolute max size I'd tolerate. You start to get into all sorts of issues with viewing distance and angle going larger.
My 27" 1440p is fine for now. I sit far enough away from it that I don't really 'see the pixels' unless I go looking for them. It was also a crazy good deal as it's a 144hz monitor that also has a built in KVM switch that's very useful for WFH.
I am curious as to what OS's you've tried. Fractional scaling is flawless on Windows and KDE6 with wayland in my experience.
I wouldn't describe any OS as 'flawless', they're all doing what I describe under the hood. QT does have better support than GTK atm. I've also seen bad behavior on windows, esp with older apps. OS X is about the best out there, but even it can have issues with applications that have a view port (i.e. video editors, etc).
I'd prefer to skip all that so I'm happy staying on 1440p until 8k monitors are where 1440p monitors are today with regard to price and quality.
It may well be doing what you described under the hood, but I've never seen any evidence of a performance problem as a result.
That's why we have so-called 5K monitors in the 27" size class: they're exactly 2x the pixel density of conventional 1440p.
27” 1440p at 100% is too small for me, so 5K at 200% has the same problem. More generally, the available PPIs combined with integer scaling only yield relatively few options at a given viewing distance. More choice would be nice.
Yep pretty much, and 6k 32" monitors. Both are fringe monitors mainly used by mac people.
You need 400% DPI scaling to make this usable.
Reminds me of Michael Stapelberg's 8k Monitor Setup:
https://michael.stapelberg.ch/posts/2017-12-11-dell-up3218k
https://michael.stapelberg.ch/posts/2020-05-23-desk-setup
Beware of backlight offsets. TV panels can have smaller backlights, because they're meant to be viewed from further away, and my LG 46 monitor didn't have backlight behind the lower 2-3 rows of pixels and a couple pixels on the left and right, when viewed at my desk. This may not impact some people, but I often go full screen text and missing some of the left and bottom pixels was annoying. I ended up able to configure i3-gaps so that it never displayed anything in those areas, solving the problem. It worked great as a huge monitor otherwise.
In my experience when a flat monitor gets too large the edges tend to be too much further than the center and as I glance around, my eyes need to refocus too much. That’s why I vastly prefer curved screens. I currently use a 4K 32in curved monitor by MSI and for me it’s just perfect
I'd be happy to, but there aren't any 8K TVs at 55" or smaller. I want the pixels, but I'm not going to put a 65" TV on my damn desk -- I have two 27" 4k now, and it's ... fine, I guess? but I want a 42" 8k running at 2x.
If you’re looking for a monitor with high pixel density and a ton of real estate, you can also buy a monitor. 5k2k’s are pretty sweet. I’m driving one of these nowadays and it’s fabulous, without all the quirks of adapting a huge TV for computer use: https://www.dell.com/en-us/shop/dell-ultrasharp-40-curved-th...
HiDPI, two 4k monitors without a bezel, 120Hz, and no need for a separate thunderbolt hub.
I'm not a fan. Large ultra-wide curved screens are fantastic. With large flat screens that are meant to be viewed across the room, you get a distorted image when you sit up close. Your eyes have to focus further away as you look at things closer to the edge of your screen and the viewing angle for that part of the screen is different from the center of the screen. It also requires more effort for your eyes to look up and down rather than left and right. We're hard wired for that horizontal plane. This makes ultrawide screens a really comfortable option.
I almost bought an 8k 55" screen for use as a monitor, but I tested a 55" 4k screen for a week and the flatness is what turned me off to it. I've been using three 32" 4k screens in portrait, arranged in a "curved" config on my desk (2 monitors on each side are mounted at an angle), which I really like. But switching to a large single flat screen was not fun.
For me the holy grail of monitors is a 55" 8k curved screen. Not "ultrawide", I want the full width and height and I want it curved, with full 8k resolution. Maybe someday, but I'm not getting my hopes up too high.
Spherical or cylindrical?
I'm not the guy you asked but I have a similar opinion on flat screens. Personally I'd want spherical. ~15" tall and ~25" wide is about my limit for flat screens, anything beyond that I find that the corners/edges are too distant/distorted. My home setup is multiple independent 27" screens, which I like. My work setup is a single flat ultrawide (34" probably?), and I find myself physically leaning my head/body from side to side when I have two windows open next to each other. I have eye level a few inches from the top of the screen, and the lowest couple inches also seem distant/distorted.
"4K is for programmers" from 10 and half years ago:
https://tiamat.tsotech.com/4k-is-for-programmers
https://news.ycombinator.com/item?id=7035030
This is something I've wanted to do for a while! I wish Samsung still produced their 55" 8K displays-- 8k @ 55" gives you effectively the same PPI as a 27" 4K display. Maybe someday.
That's a hell of a desk. And counter to the argument that "you could just have the one huge screen for entertainment AND work" because this is not a desk you can easily clear out from in front of the sofa when you stop working.
This is making me want to get some blackout curtains for my living room so I can go back to occasionally working with my laptop hooked to the projector, though. It's about the same resolution as my laptop but it's really nice to be focusing on something across the room for a change.
I use a 50" 4K TV as my monitor. It's mounted on a long TV mount that can bend at 3 points, one near the wall, one near the TV and one in the middle. Gives me great freedom. One warning to people who want to do the same: make sure your mount has a way to rotate (around the screen's surface normal) the TV as the weight of it will make it sag.
Pretty wild that the only reasonably sized 8k Monitor is going to be 8 years old next spring. Nothing coming close was ever released after that.
The Dell UltraSharp UP3218K
That's XBox Scorpio old.
Just hoping for some 32 inch 8k OLEDs driveable @120hz before my eyesight deteriorates.
I've been using 50" 4K/60 TVs (3x actually) as monitors since 2015, and I love them. Before that, from about 2007 on, I used 6x 24" LCDs, and when I wanted to upgrade it didn't make sense to bother with small LCDs and add another vertical row for 12 displays. I found Samsung curved 4K LCDs at the time for around $650 each shipped around Black Friday, so it was a no-brainer. I've never really looked back, or would consider anything smaller now.
I am wondering how 8k displays would look replacing my current samsung 4k's as these are pre-HDR, but I'll probably use these until they start dying with no complaint. Plus no one does curved displays now, which I'll miss from my current TV monitors.
If its not too much of an intrusion, can you share a picture of your setup?
Heh, I do something similar as well, with a 48" LG 4k OLED, which seems popular with other users too. I got this over another 4k or 8k TV because 1) OLED simply looks better and 2) 120 Hz is nice for gaming, though I do want to get the same type of TV with 240 Hz instead for some of the higher-twitch games.
I use Windows and the PowerToys utility, which might arguably be the best window manager I've used, even above tiling window managers on Linux, simply because I can specify exactly the layouts I want for every single virtual desktop and every single app.
Overall it works well but for the first little while I did get a headache from sitting too close, but it went away soon after.
At home I use 2 28" 3:2 4k displays and in the office I use the same setup plus 2 additional 24" WQXGA displays, and I like the ability to spatially arrange windows and corresponding tasks. My mind just doesn't work the same with one huge display. I even noticed this back in the day when multiple displays meant 2 17"-19" 4:3 or 5:4 displays and the first colleagues started to use the first 30" displays with 2560x1600.
I used a 43" 4k TV to replace a multi-monitor setup, and the neck and eye strain was brutal for me. Even with a really nice display with a high refresh rate, viewing the corners from that close up was worse than useless. The brightness was difficult to tune down enough to reduce eye strain and from that close up reducing blue light through software wasn't very helpful.
I've since switched to a 32" 4k curved display (still 16:9, not ultrawide) and have been much happier. The curve makes more of the view useful from the periphery and the display has some quality-of-life features, like displaying multiple inputs as separate and ratio-configurable "monitors" in hardware. It's also nice to have controls on the display; the TV relied on the remote, and I kept losing track of it.
The only thing I miss is being able to switch to watching sports at the end of the work day, and being able to cast video to it. Those were luxuries duplicated by other things already in the house. I'd like to say I miss gaming on it but I honestly don't, it's much nicer to not have to extend the keyboard and mouse back far enough to also see the entire display at once.
I work mostly with text and code so the curve isn't an issue, and I could see designers preferring a flat panel to avoid distortion. Otherwise I'm not sure I could go back to having such a large display, much less a 65" display.
EDIT: Per another comment, I have mild hyperopia diagnosed about a year into using this setup, which continued for another year after getting glasses to correct it. My prescription has not changed since getting the new display.
I've been using a 34" 1440p curved ultrawide monitor (21:9) since 2020 and it's been amazing. Earlier this year I decided to try using a 42" LG OLED TV as my monitor and lasted about a day before deciding to go back. I 100% agree with you RE: viewing the corners of the flat screen. I'll never go back to a flat monitor/TV for my primary PC again. I think my ideal monitor is ultrawide, curved, 1440p, OLED, and 38" or so.
I've long considered going this way myself, but 8k is tricky for a number of reasons:
- I am very sensitive to glare, and all TVs are glossy
- Smallest size you can get is 55" (up to 50" would be good for me as I keep my 32-incher on a custom 8" stand — I am pretty tall — so it would simply be a wider screen that goes to my desk with top being at the same point)
- Connectivity sucks: I am so used to running only my laptop with a single USB-C connection. I had enough of the early Dell MST 24" 4K screen that required 2 DP 1.1 connections IIRC (basically the same thing as their 32" 8K).
- I mostly use Linux (Mac for work though)
So I am waiting for a monitor that can do 8k at 60Hz with an ultraportable that runs Linux and an iGPU that can drive it for productivity (software dev, browsing, video calls — yes, full screen video call is a hog at large resolutions, at least in Linux).
I'll probably sacrifice on the pixel-density front instead (4K at 32" is not enough either) and go with one of the 42-43" 4K options people have mentioned elsewhere.
Assuming those audio speakers are at ear height (I assume they are, since those IsoAcoustics stands allow tilting but there is no tilt in the picture), then IMHO the display is placed too high; ideally you want your eyes level just below the upper edge of the screen. I don't blame OP though, I just think that with this screen size it is challenging to achieve.
I already use a 4K TV for a monitor. 8K would just push a need for a more expensive video card, while decreasing how well people can see when I share my screen. Even on a 4K, I need to blow it up to ridiculous zoom levels to make a screen-share readable to others.
I'm sure not everyone would run into that problem, but it is a fairly strong con to be aware of.
If you are on Linux, you can divide the entire screen into multiple virtual monitors and share only one of them. This has the benefit of giving you "private" monitors that won't be shared.
Another option could be to temporarily lower the resolution.
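For the curious, here's a minimal sketch of that virtual-monitor trick on X11 using xrandr's --setmonitor. The output name, resolution and physical dimensions below are placeholder assumptions, not anyone's actual setup; check `xrandr --query` for your own values.

    # Split one physical 4K output into two virtual RandR monitors, so tools
    # that enumerate monitors (e.g. screen sharing) only see one half.
    import subprocess

    OUTPUT = "HDMI-0"          # assumed physical output name
    W, H = 3840, 2160          # full panel resolution
    MM_W, MM_H = 1210, 680     # assumed physical size in millimetres

    def setmonitor(name, w, h, x, y, output):
        # xrandr geometry format: width/mm-width x height/mm-height + x + y
        geom = f"{w}/{MM_W // 2}x{h}/{MM_H}+{x}+{y}"
        subprocess.run(["xrandr", "--setmonitor", name, geom, output], check=True)

    # The first virtual monitor stays bound to the real output; additional
    # virtual monitors on the same panel are declared with output "none".
    setmonitor("virt-left", W // 2, H, 0, 0, OUTPUT)
    setmonitor("virt-right", W // 2, H, W // 2, 0, "none")

    # Undo with: xrandr --delmonitor virt-left ; xrandr --delmonitor virt-right
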
I run an 8K monitor on a $240 GPU.
> 8K TVs may be driven at 8K 60 Hz with no chroma subsampling by using HDMI 2.1, which is available on all current (Nvidia RTX 4000 series and AMD 7000 series) and previous gen (Nvidia RTX 3000 series, AMD 6000 series) graphics cards. Older computers with GPUs outputting DisplayPort 1.4 may use adapters such as the Club3D one to achieve 8K 60 Hz.
Isn't "plain" DP 1.4 confined to HBR3 - thus its maximum refresh rate is 8K-30Hz?
https://en.wikipedia.org/wiki/DisplayPort#Resolution_and_ref...
He mentions this adapter: https://www.amazon.fr/Club3D-CAC-1087-DisplayPort-4K120Hz-8K.... With DSC 1.2, you should get 8K at 60Hz.
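Rough back-of-the-envelope numbers behind that exchange (active pixels only; blanking intervals, FEC and other link overheads are ignored, so treat these as ballpark figures):

    # DP 1.4 HBR3 payload vs. uncompressed 8K bandwidth, 8-bit RGB.
    lanes, hbr3_gbps = 4, 8.1             # HBR3: 8.1 Gbit/s per lane, 4 lanes
    payload = lanes * hbr3_gbps * 8 / 10  # 8b/10b coding -> ~25.9 Gbit/s usable

    def stream_gbps(w, h, hz, bits_per_px=24):
        return w * h * hz * bits_per_px / 1e9

    print(f"DP 1.4 HBR3 payload : {payload:.1f} Gbit/s")
    print(f"8K @ 30 Hz          : {stream_gbps(7680, 4320, 30):.1f} Gbit/s")   # fits, barely
    print(f"8K @ 60 Hz          : {stream_gbps(7680, 4320, 60):.1f} Gbit/s")   # does not fit
    print(f"8K @ 60 Hz, DSC ~3:1: {stream_gbps(7680, 4320, 60) / 3:.1f} Gbit/s")  # fits

So uncompressed 8K over plain DP 1.4 tops out around 30 Hz, which is why the adapter leans on DSC to reach 60 Hz.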
11 years ago, "4K is for Programmers" - https://news.ycombinator.com/item?id=7035030
I love my Acer Predator 43” 4K. It’s small enough that I don’t feel like I need to extend my desk to sit far enough away, and it also just squeaks under the max load for the Ergotron HX monitor arm.
It’s extremely sharp for normal use, and doubles as a 4K 120Hz monitor for gaming.
Coatings that don't cut down on reflections are the biggest issue I've had the various times I've gone this route.
I use dual 27" 144kHz 4K monitors and am mostly pretty happy with my setup though I have considered moving to an Ultra Wide curved monitor, I'm just not sure if the OCD side of me would be bothered by the curvature.
Unless I'm misunderstanding, one of the advantages of using physically distinct monitors is that it's easier to send things into a full screen mode without affecting the other displays - I guess apps that support "borderless windows" are less of an issue.
Maybe there's some type of cross-platform (Mac, Linux, Windows) virtual display driver software that can let you create "picture in picture" virtualized monitors, though?
>Unless I'm misunderstanding, one of the advantages of using physically distinct monitors is that it's easier to send things into a full screen mode without affecting the other displays - I guess apps that support "borderless windows" are less of an issue.
This is one of the reasons I stuck with two monitors instead of one long one when I upgraded a while back. I know there are workarounds and helper programs you can install and whatnot, but I like being able to drag something to the side and full screen it without any additional hoops. Plus the long monitor crowd tend to have things centered on the screen and then have small accessory areas to either side instead of two distinctly large screens. Plus resolution wise, unless you're going with a really wide monitor, you probably have more overall resolution with two screens, especially if price is a factor at all. Standalone 27" monitors are basically the standard and are priced accordingly.
My Dell monitor has a picture-by-picture mode which works very well to simulate 2 distinct displays. Each side uses its own video input. Many higher end monitors can do this, unsure how many TVs can.
I haven’t figured out the trick to make OSX use the entire screen on PBP mode. I just get two little screens.
For me, the best monitors by far for programming are LG's 28in DualUp, due to the aspect ratio. I have a pair side by side, and it's effectively four 1440p screens in a 2x2 layout, giving lots of vertical space without a bezel as well as horizontal space on each screen.
I have a 32 inch 4K. It's good, but even that feels a little big sometimes. I have a 55 at work and it's just silly.
I have a 28" 4K 144hz monitor since two years now, its been great.
Good show.
The one issue that I have with using TVs as "monitors" is that they are too damn "smart." They play with the image, and it can be a devil to find all the settings to turn that off. On my Samsung, there's a couple of things that I can't turn off.
I sorta tried this, using a single one of those large 4k curved monitors at my desk in San Francisco before the pandemic. It was alright, but I always liked having two 2k monitors more. At this point, as an Awesome WM user (there are dozens of us!), I really depend on having two different monitors so I can have two different sets of tiling window tags.
> You can even use the same TV for 4K 120 Hz gaming or watching movies as a bonus!
But then you can't use the computer at the same time. With a 3-monitor setup I can add an HDMI switch to one of them, and when I want to play, I can switch that monitor over to the PS. That way I still have 2 monitors to use: one for TV in the browser and the other for other stuff.
Am I the only person who wants a monitor that's curved in both axes (left/right and up/down) so I can surround myself with a sphere of monitors, and then pivot on a gimbal?
Apple Vision Pro would probably accomplish this
What are its viewing angles?
It's around 100 degrees, while humans can see more like 180 degrees (more if you move your eyes; I don't want to move my eyes, I want to gimbal my body to focus on a specific monitor), although outside the center of your vision you don't have good "resolution". The Vision Pro would be like being inside the sphere, but with a big aperture blocking all the side monitors.
Just got a 32" 4k. I had a 49" 4k in the past, but it broke. My issue with monitors above 49" is it strains the eyes and head looking around. I always had to partition the screen or manually resize, it got annoying. Gonna try 1 4k for landscape and 1 for portrait now.
The checkerboard pattern is from not using VRR. You need to enable game mode and select VRR as the refresh rate in the OS settings.
I wanted to go down this path some months ago, but couldn't find any options on the market. I ended up with a 42" 4k LG C3, but it's just "ok" because I can easily see pixels. I wanted to use the room as dual use work/watch movies, but without the need to watch movies I'd probably go back to a wide screen curved display.
Televisions continue to be the best deal going: https://ourworldindata.org/grapher/price-changes-consumer-go...
That chart doesn't even fully account for the increased size, pixel density, color accuracy, contrast, and refresh rates.
Recently tried it. Couldn't find one with a high enough PPI for my liking.
Ended up with two vertical 4k monitors. Side bonus: I can put my web cam dead center in front of me.
I'm much more interested in going the other direction, in order to get a TV without all the crapware.
How about a projector? There are no 8K projectors.
Well that's not technically true...
https://www.jvc.com/usa/pro/projectors/dla-vs8000g/
But obviously not affordable.
Discontinued:
https://www.projectorcentral.com/JVC-DLA-VS8000G.htm
I'm not sure if they ever shipped it to any retail customers. I'm a JVC projector owner so I kinda follow JVC projector news. The higher end JVC PJs are used by Boeing for flight sims:
https://www.boeing.com/defense/support/training/constant-res...
JVC accommodates that use case with things like extra chassis mounting points to allow the projector to be mounted securely in a dynamic environment. This looks like it may have been an early POC in native 8K for Boeing.
"8K TVs tend to start at around $1500 to $2000 for a 65” one. This is about the same as getting four 32” 4K monitors."
Getting 3 32" 4k monitors is still better than having a single point of failure. But also I'm extremely happy with my single Odyssey G9 55".
You're so right about the single point of failure.
I bought a 30" monitor back in 2008 when that constituted a large monitor. It had a 12 month warranty and died after 13 months. :-(
I switched to 2 24-inch monitors which cost less, had more total pixels, and most importantly I no longer had that single point of failure.
do some people find it pleasant to pan their head?
maybe my eyeballs don't rotate far enough?
Ultrawide 5K2K is a great sweet spot, at least for what I do, which includes a bit of everything. I never liked dual monitors with a split in the middle. Ultrawides solve that.
Which brands/models make up your setup? Did this turn out to be more or less affordable than expected? Thanks in advance.
My 4K 55” monitor mostly serves me quite well, but I’ve been a bit annoyed by the low pixel density. Wonder if my Macbook can drive an 8K one.
Where do 5K monitors fit in the current 4K+ future?
It feels like VR is the future of work that benefits from lots of display space.
I want a 8K 27". Density is important to me.
I want 4k/5k and an 18"-21" diagonal, but all the hi-dpi smaller screens go to laptops and tablets, I guess. No monitors like that. Hell, under 27" and 4k can be tricky to find these days. 24" models exist but are a shrinking category.
I don't want or need my monitor to take up a huge amount of space. But I do want high pixel density. Looks like I'm in too small a market to serve.
10 years ago it was the 30" Seiki 4K TV
https://www.youtube.com/results?search_query=seiki+4k
Niche no more.
Does any one else have mouse lag with giant high pixel density monitors?
That's typical of TVs. The signal is delayed by post-processing, because for passive entertainment, why not? Your TV will likely have a mode that skips post-processing and has minimal delay, often called PC or game mode. Look up "[your TV model] game mode".
Check your refresh rate. With my 4K TV, the mouse gets laggy if it falls back to 30 Hz.
I plug my TV directly into my laptop with a USB-C -> HDMI cable. Docking stations often fall back to lower refresh rates.
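On Linux/X11, a quick way to confirm what refresh rate the TV actually negotiated is to look for the mode xrandr marks with an asterisk; a tiny sketch (assumes xrandr is installed and on PATH):

    # Print the currently active display mode(s) reported by xrandr.
    import subprocess

    out = subprocess.run(["xrandr", "--query"], capture_output=True,
                         text=True, check=True).stdout
    for line in out.splitlines():
        if "*" in line:              # the active mode is flagged with '*'
            print(line.strip())      # e.g. "3840x2160  60.00*+  30.00"
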
Usually TVs now come with a game mode that doesn't do post processing and lowers latency.
It might be your port or cable that doesn't support the required bandwidth to drive your hi-dpi display at the selected refresh-rate.
I just bought the EU version of this in 55 inch and candidly wish I'd gotten the 65. With the 55 I have to run it scaled, and my Mac crashes daily with it.
I didn't even know 8K TVs and monitors existed.
I am still on dual full-HD displays and was considering a single 4K or 5K display between 27 and 32".
I think the questions to ask are:
1. At what size and resolution are flat screen monitors most useful?
2. At what size do curved screens start becoming useful?
3. What is the upper limit for useful screen sizes?
4. Is there an upper limit for useful resolutions?
For 1 and 2, I would say it totally boils down to personal preference and the distance/size ratio. For 3, again, distance to the screen matters a lot.
The 4th one I've seen the most heated discussions about. In my opinion, the highest resolution you can afford (both money-wise and computational-power-wise) is the most useful one. Even if you can't distinguish individual pixels (no screen-door effect), aliasing is still an issue.
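One way to put rough numbers on question 4 is pixels per visual degree: a common rule of thumb is that around 60 px/degree matches 20/20 (1 arcminute) acuity, though extra density can still help with aliasing. A small sketch, with the 65" panel and 30" viewing distance below being purely illustrative assumptions:

    # Pixels per degree for a 16:9 panel at a given viewing distance.
    import math

    def pixels_per_degree(diagonal_in, horiz_px, viewing_in, aspect=(16, 9)):
        w, h = aspect
        width_in = diagonal_in * w / math.hypot(w, h)     # panel width
        pitch_in = width_in / horiz_px                    # width of one pixel
        deg_per_px = math.degrees(2 * math.atan(pitch_in / (2 * viewing_in)))
        return 1 / deg_per_px

    for horiz_px in (3840, 7680):                         # 4K vs 8K panel
        ppd = pixels_per_degree(65, horiz_px, 30)         # 65" panel, 30" away
        print(horiz_px, round(ppd), "px/deg")

With these assumed numbers, the 4K panel lands well under 60 px/degree and the 8K panel lands above it, which is roughly where this debate tends to sit.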
I'd love to do this but always worried (probably incorrectly) that the energy output wouldn't feel great and result in faster fatigue or require more rest breaks.
I use dual 43" 4k TVs as monitors. It's fantastic.
My wife asked me how much "huge monitors" cost. I told her 100 bucks on Craigslist. Indeed, we got her an old dumb 1080p LCD and she has been super happy with it. It mostly fills the wall of her little cubby hole in our office.
For my money, I have 2x 1080p 24” displays, and a third curved 32" 1080p display which is hooked to a KVM so I can game on it.
I like the 3 monitor setup because they are all at angles from each other, approximating a huge curved display. Plus, this was a cheap setup off woot.com parts.
1080p is a tiny monitor in today's standards. It's also very similar to the old SXGA resolution that was very common in the late 1990s / 2000s.
1080p is good enough for me. I'm not sure buying 3 4k monitors is going to improve my life any, what with my middle-aged eyes and all.
Also, old stuff lacks shady smart features. Bonus!
I think monitors are like headphones. Unless you actually try the "better" ones, you don't have a clue what you're missing. I know because I had been saying "Dual 1080p 24" is all I will ever need." for a long time until I got a 4K 50". Now I can't imagine going back.
I usually use a pair of Sennheiser HD280s that I've had for over a decade. I've used some fancier headphones costing more than an order of magnitude more, from brands such as ZMF. After experiencing the high-end advantage, I'm still perfectly happy with the 280s. There are a few things I care about in a monitor, and DPI is nowhere on the list. Every monitor commercially available has more resolution than I care about. My number one concern is consistency across a wide viewing angle. Low latency, retina DPI, gamut accuracy, HDR, curved surface? I don't care about any of them. I have tried all of them.
I checked with my wife and she is unsympathetic to this idea.
Ahhh....1280x1024 on a 19" LCD in 2001, it felt like a 4K monitor does today.
1600x1200 on a 21" CRT was king. though.
1080p at 32"? Dear god man have some self respect. Not everything on a screen is meant to look exactly like Tetris you know.
Some people actually don't care. I'm one of them. I express my self-respect in ways other than my screen's resolution.
The treasure of my retro gaming collection is a 720p 32" CRT. It must be 100lbs.
May I ask at what configuration? I'm assuming at least one is vertical because I can't think of a way to set 2 43" monitors horizontally without breaking my neck.
...why?
Why not?
Is there a plugin for Mac to make tiled windowing easier? All the current ones are a bit too hacky. I really liked tiling in PopOS.
I've been using AeroSpace with AutoRaise since coming back to a Mac after years on i3/sway.
Rectangle is pretty good.
The linked article suggests yabai.
I dislike TVs for their high input lag, bad image uniformity, unwanted post-processing, and high energy use (hot rooms).
Modern TVs have decent input lag around 10 ms which is on par with professional monitors, but of course it will still be worse than gaming monitors. Lots of people game on their TVs. And most TVs have settings that disable postprocessing.
I'm sure a few have tried this before, but no one has given me a good argument for convincing the partner.
A couple of years ago I spent a year or so like this, with the TV resting directly on the desk.
It looked pretty nice, but it had some problems.
- The only actual 8K modes reported over HDMI were some variant of YUV, meaning you could not select what your OS considered an RGB mode
- Even using it at 4K, with the 55" TV a couple of feet from the back of the desk, my eyes could not keep all of it perfectly in focus.
- The power consumption was much higher than a typical ~30" monitor, and the amount of heat created was also significant. This became hard to deal with in summer.
Eventually I gave up on it and returned to a ~30" monitor.
FYI nowadays 8K TVs support true RGB 8K 60 Hz over HDMI 2.1 with no chroma subsampling.
Buy once, cry once.
All else being equal, a TV (i.e., TV-sized) unit generally has a broader set of use cases and longer useful lifecycle than a computer monitor for the original purchaser†, which could be argued makes good economical sense.
† in my experience, computer monitors can have a long useful life when factoring in the potentially long tail of "donor/hand-me-down" cases...
But the other potential uses of a TV assume it's not tossed on a desk in my office.
You get the partner something shiny too!
At least he has proper speakers to go with that ridiculous screen!
Does a 8K 42 inch option exist?
No.
Main use of 8k is really high pixel density.
In a perfect world I'd have smart glasses that would display arbitrary resolutions that you could move, minimize and expand at will.
I went through a phase of wanting the most possible screen real estate to do sick multitasking gimmicks like having chats, documentation, code editor, and prototype open at once. It was glorious: a 5K2K ultrawide monitor filled to the brim with a mishmash of sometimes related, sometimes unrelated windows.
Then it hit me that I can only focus on one thing at a time since I’m a human being, and having multiple attention grabbing things in front of me is never good. I now run a single Studio Display and have a code editor in full screen, switching to other content through virtual desktops. I’m WAY more productive this way.
Now I might just have a short attention span and that’s that, but using a TV as a monitor sounds like hell to me now.
I've always wondered why everybody would buy "monitors" for computer use. Isn't it the same thing as a television screen? Back then TVs used to take different inputs but everything is digital now.
That checkerboard effect is certainly interesting. Someone somewhere is going to be nostalgic about this artifact someday, maybe they'll even make a shader to emulate it. I wonder what causes it and why it disappears in game mode.
> on Linux it took about two years for 8K 60 Hz support to work, spawning a salty thread on GitHub
All I see is paying customers asking for support.
> The AMD on Linux fiasco is because the HDMI Forum has prohibited AMD from implementing HDMI 2.1 in their open source Linux drivers.
That's weird since nvidia's open source driver has an implementation.
There is nothing nostalgic about the checkerboard. It's simply a foot gun for those unfamiliar with using Samsung tvs as monitors.
I'm embarking on a similar geek journey. Just today I bought a used radiology PACS display (barco mdcc-6430) just to see if there is anything novel or cool about the picture or any clinical features. I'm not expecting much but stuff like this is how you find out.
This display is color, however I have considered getting a grayscale only rads display for "ADHD purposes" i.e. the same reason people are interested in e-ink displays (well, one reason).
It will probably be a huge waste of time and money but I'm just a masochist for tech pain I guess...
Nope. I want to be able to properly split the screen across different inputs, because of the lack of proper window/workspace management when you're not using separate monitors.
I'd suggest getting a better window manager
I use a 4k TV. I've wanted upgrade to 8k for a while, but according to this post AMD on Linux can't do 8k so I guess I'm sticking with my current setup.
My 780M already struggles running GNOME at 4k, so maybe that's for the best.
I am using a pair of 43", 4k TVs. The 2x43" configuration has been my working setup for about a decade. I love it.
Anything bigger than 43" diagonal causes neck pain because it is too tall.
I understand three 32" TVs also work.
Seems pretty wide. No problem there?
I use a single curved 57" 32:9 DUHD monitor (Samsung Odyssey NEO G95NC) for work and gaming. Previously I used 3 24" monitors, but I like this setup a lot more.
I split it into 3 sections (browser for docs, the rest terminal/nvim), but I can easily change this if I want to show Slack, for example. For gaming I go fullscreen (and use overlays for stuff like VoIP or browsing) because it is a lot more immersive.
I used a 32" non-curved 4k monitor for a few months once. At some point I realized that I was moving my head around a lot as the corners were at an awkward place. On 28" I don't have this.
So anything above 30-ish inches I would consider either curved (expensive for hidpi resolutions) or two/three 27" screens angled a bit.
I can't imagine how bad it would be on a 65" flat screen.
30” flat screen at normal desktop viewing distance seems to be my personal limit, too.
How close are you to the screen? With my face about 1 foot away, I can easily scan all corners on the 32" 4k flat screen just by moving my eyes.
> TLDR: If your job is to write code all day [...], buy an 8K TV instead of a multi-monitor setup.
Counterpoints:
• All my keyboard muscle memory is set up for multi-monitor setups. Theoretically fixable with the right tiling window manager... which I would presumably have to install, since I do too much Windows stuff to go full-time Linux. Or perhaps develop one. Buying more monitors is a better use of my time.
• I curve my monitors inwards, intentionally, for better viewing angles. Also lets me hide a tower in one of the corners behind the curve on a straighter desk.
• I do too much multi-machine development (e.g. testing refactoring of multi-platform abstractions). HDMI switches are super convenient; your TV's picture-in-picture functionality may or may not be. Dual Windows PCs for testing on Nvidia and AMD simultaneously, or remaining unblocked when busy reformatting/reinstalling/compiling/linking/syncing 100GB+ on one? Yes please. It's often interactive enough to want to keep open, yet passive enough to need something else to do. OS X for iOS and Linux for debugging server code? Sure. iOS and Android? Well... those have their own monitors. Consoles don't though, and I've targeted those too.
For an entertainment setup, I can usually scrape by with 2 or 3 monitors (1 landscape for a fullscreen game, the others typically portrait for chat/wiki/etc). Right now, I'm on a 75" 4K chonker. I have good eyes, but 8K would be a waste of pixels, and I'm already close enough that the viewing angles are noticeable. Yet I still hauled out a second monitor: an old 2.5K to exile junk I want to monitor off the main screen.
For a development setup, I've bought or brought a 4 x 27" 4K setup if one isn't provided. A 5th monitor has occasionally been useful (1 landscape for console, 4 portrait for console IDE, devtools, devtools IDE, and docs/wiki/jira/chat/notes. Replacing the 4x portrait with 2x 8K landscape... would probably work, at least, although I'm not convinced it'd feel like much of an upgrade, if any.)
> Having seven evenly-spaced columns would be impossible on a dual 4K display setup due to bezels in the middle.
I know I'm getting into old man yells at cloud territory here, but nobody needs this. Code on a 1024×600 netbook display, it will build character.
In the mid 90s, professional video game programmers typically used a 1920x1080 display, just to have a larger code canvas and sharper text.
From the 90s on, 1600x1200, 1920x1080, and 2048x1536 were resolutions one could find on professional displays.
From the 2010s on resolutions increased tremendously and 3840x2160 became the norm for consumer and professional displays.
When working with code you essentially work with text. You just want a big canvas and crisp text, thus high resolution.
I guess. I think the important thing is getting the program in your head, not on the screen. If the code is too complicated to hold it all in your mind then more columns of crisp text will not save you.
Like Joey Hess: https://usesthis.com/interviews/joey.hess/
I actually kind of agree with this. For me, the more pixels the better (I'm sensitive to fuzzy text, and subpixel rendering makes it worse), but I'd really prefer just one monitor, not too big. 15-19" is fine, especially if it's 4:3. 1600x1200 on a 17" monitor would be really nice.
Either SICP or PAIP; working through these with cwm, uxterm, and an editor is mind-changing.
TL;DR: You can't really replace a monitor wall with a single screen because it does not curve to create the right viewing angle, which makes text seriously unreadable at the edges, which forces you to seriously upscale the font size, which steals the largest amount of real estate possible. Of all the compromises to make, reducing the number of screens is one of the worst ones.
4K screens are already somewhat questionable for productivity for this reason alone. The only serious argument to be had is 1440p vs 1080p (personally I would argue for 1080p, if using bitmap fonts and having perfect eyesight). A 4K monitor wall is a rather fringe setup that only works out to an advantage for day traders and weird surveillance applications. And it requires constant, energetic body gymnastics to change your perspective's location and be able to see all the details. With a single 8K screen without upscaling the font size (hence preserving all that technical real estate), the body gymnastics required would be so much worse than with a 4K wall that it would be absolutely ridiculous, clown-like, and almost impossible to use while typing. Otherwise people mainly want big 4K/8K screens for dual use as a TV set. But this is just wrong in itself; it creates a paradox for no good reason, like using screwdrivers as chisels. Some things are not meant to be. The only arrangement where 4K makes some sense for common use cases is maybe above a curved ultrawide.
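A rough sketch of the edge-geometry point being argued here, assuming a 65" 16:9 flat panel viewed from 60 cm (purely illustrative numbers, not anyone's measured setup):

    # Off-axis angle and extra distance to the side edge of a flat panel.
    import math

    DIAG_IN, VIEW_M = 65, 0.60
    width_m = DIAG_IN * 0.0254 * 16 / math.hypot(16, 9)   # panel width in metres
    half_w = width_m / 2

    off_axis = math.degrees(math.atan(half_w / VIEW_M))   # angle to the side edge
    edge_dist = math.hypot(half_w, VIEW_M)                 # eye-to-edge distance

    print(f"half width     : {half_w:.2f} m")
    print(f"off-axis angle : {off_axis:.0f} deg at the left/right edge")
    print(f"extra distance : {100 * (edge_dist / VIEW_M - 1):.0f}% farther than center")

Under those assumptions the side edges sit roughly 50 degrees off-axis and noticeably farther from the eye than the center, which is the effect curvature (or angled separate monitors) is meant to counteract.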
A sure sign that someone doesn't know what they're doing is if they use 3 or more monitors for programming.
And also interested in their neck related issues 10+ years later.
One screen for IDE (center)
One screen for documentation/browser
One screen for running the application (being developed).
Please, go ahead and explain to me how I don't know what I'm doing.
I normally work with a 40", I'm using a a hammerspoon to divide the screen, but normally I end using one main window, with some smaller window at the side and cmd-tabbing between info. How do you manage the distraction of so many information at the same time? Do you switch between apps? use the mouse? don't you loose track of where the focused window is?
There are always good exceptions. But it's a rare sight.
I like to think Jeff Atwood has some idea of what he's doing.
https://blog.codinghorror.com/three-monitors-for-every-user/
People and their need for a "leader", no matter the quality. We've had enough "truth tellers" and "follow me" kinds of shills.
Time to realize that not everyone on the internet is your friend. They feed you bullshit all the time and laugh at how gullible people are, questioning nothing and just following based on the perceived merits of an individual.
A sure sign that someone doesn't know what they're doing is if they can judge someone's competence by how many monitors they use for programming.
I've had two monitors since the mid-2000s, and only recently gave them up for one 48". I haven't had neck problems yet in any way.
Huh? One screen for email/Slack/..., main screen for the IDE, another screen for logs etc. It's a lot less context switching to glance left/right than to go to another virtual desktop.