I don't really see the engineering challenge with 4K. You can more or less make a panel of arbitrary resolution (until you run into signal-noise limits in the wiring, I guess), so 8K, 16K, whatever-K, at any aspect ratio.
The real problem with high-resolution displays is the extreme lack of source content at high resolutions and framerates. Even console titles are coded to "max out" the consoles' limited hardware at about 1080p/60 FPS nowadays; I think that's the target, anyway. They'll need new consoles, or an option to dial down shaders/LOD, once people start asking for 4K.
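To put rough numbers on that (my own back-of-the-envelope math, nothing from a console vendor): 4K is exactly four times the pixels of 1080p, so at a fixed framerate the GPU has roughly four times the fill/shading work per frame.

```python
# Back-of-the-envelope: how much more pixel-shading work 4K demands vs. 1080p.
# The 4x factor is a rough upper bound; actual cost depends on the engine.

def pixels(width, height):
    return width * height

p1080 = pixels(1920, 1080)   # 2,073,600 pixels
p4k   = pixels(3840, 2160)   # 8,294,400 pixels

ratio = p4k / p1080          # 4.0
print(f"1080p: {p1080:,} px")
print(f"4K:    {p4k:,} px  ({ratio:.1f}x the shading work per frame)")

# Per-second load at 60 FPS:
print(f"4K@60 shades {p4k * 60:,} px/s vs {p1080 * 60:,} px/s at 1080p@60")
```

That's why "dial down shaders/LOD" is the escape hatch: you trade per-pixel cost against pixel count.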
Pretty much the only place where extremely high resolutions are interesting is PC graphics (gaming, demos, medical imaging, CAD, presentations, etc.)... and that's an extremely niche market.
The "general public" that's watching cable on their living room low-powered boxes are barely catching onto 1920x1080 at 24(/30?) FPS content, so there's extremely little interest to spend marginally more money for absolutely zero increase in quality.
That being said, high resolutions aren't anything new on PC either; people have been running arbitrary resolutions across multiple displays since the mid-2000s. The annoying part there is the bezels between displays, or the extreme cost of avoiding them (6, 12, 24 projectors, for example). The real benefit of a "standard" is that manufacturers can mass-produce to it (monitors and/or projectors in this case), so economies of scale kick in. But if you want some arbitrary resolution today, you can pretty much already have it.
I think the Oculus Rift will change a lot of things if they can squeeze really high-resolution displays into it in a few years. Think content (not just PC/games but also movies/sports) where you can place yourself anywhere on the field and look in any direction. It's not that far off; I give it 1-2 years before something like that is implemented.
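To illustrate why VR is the one place that genuinely eats resolution (rough numbers of my own, assuming a Rift-like ~100° horizontal field of view per eye, which is an assumption and not a spec): the panel's pixels get spread over a huge field of view, so per-degree sharpness ends up far below a desktop monitor's.

```python
# Why VR panels need extreme resolution: pixels per degree (PPD).
# Assumed numbers: ~100 deg horizontal FOV per eye, and ~60 PPD as the
# usual 20/20-vision ("1 arcminute per pixel") benchmark. Illustrative only.

FOV_DEG = 100.0

def ppd(horizontal_pixels_per_eye):
    return horizontal_pixels_per_eye / FOV_DEG

for name, px in [("1080p split (960/eye)", 960),
                 ("4K split (1920/eye)", 1920),
                 ("8K split (3840/eye)", 3840)]:
    print(f"{name:24s} -> {ppd(px):5.1f} PPD")

# Hitting ~60 PPD over 100 degrees needs ~6000 horizontal px per eye.
print(f"target for ~60 PPD: {60 * FOV_DEG:.0f} px per eye horizontally")
```

Even an 8K panel split between two eyes only gets you ~38 PPD under those assumptions, which is exactly why headsets are where "whatever-K" resolutions stop being a gimmick.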