As I said, the VLIW approach loses because it needs "magic compilers" or hand-tuning in assembly language to approach the best performance of conventional superscalar CPUs. The AI-based compiler technology you predict will solve the magic-compiler problem is not yet on the horizon, so VLIW under-performs superscalar designs, whose instruction-level parallelism is extracted automatically in hardware, hidden from compilers and applications. Though VLIW may well have the theoretical potential to exceed current superscalar strategies, the lack of the required compiler technology justifies the industry's continued massive investment in superscalar designs.

When choosing technology winners, I believe you can always follow the money: the winner is picked by the industry R&D dollars being invested, not by theoretical technical superiority. Dominant technologies that hold their market position because of massive funding, yet are arguably technologically inferior, include the x86 ISA, PCI Express, Ethernet, SCSI/SATA (now being quickly superseded by NVMe, but dominant for a long time while inferior), and, I would argue, DDR. VLIW investment is currently trivial by comparison to superscalar R&D.

Further evidence for my pessimism is the lack of successful products in the market after decades of people touting the VLIW approach, and the fact that one of the biggest and most expensive failures in computer engineering history, a multi-company disaster to boot, was a VLIW CPU, the Intel Itanium.
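To make the division of labor concrete, here is a toy sketch (my own illustration, not any real compiler's algorithm) of the job a VLIW compiler must do statically: pack independent operations into fixed-width issue bundles while respecting data dependences. The machine width and the operation list are invented for the example; a superscalar CPU performs the equivalent packing dynamically in hardware, which is why it needs no magic from the compiler.

```python
# Toy static VLIW bundler: greedily pack operations into fixed-width
# bundles, respecting read-after-write dependences. Assumes in-order
# issue and single-cycle latency (a deliberate simplification); real
# compilers must also model multi-cycle latencies, functional units,
# registers, and memory dependences, which is where the difficulty lies.

WIDTH = 3  # hypothetical 3-issue VLIW machine

# Each op: (destination register, tuple of source registers) - invented data.
ops = [
    ("r1", ("r0",)),       # depends only on r0
    ("r2", ("r0",)),       # independent of r1, can share its bundle
    ("r3", ("r1", "r2")),  # depends on both r1 and r2
    ("r4", ("r0",)),
    ("r5", ("r3", "r4")),
]

def schedule(ops, width=WIDTH):
    """Pack ops into bundles; an op joins the current bundle only if it
    reads no register written by an op already in that bundle."""
    bundles = []
    for dest, srcs in ops:
        if bundles and len(bundles[-1]) < width:
            written = {d for d, _ in bundles[-1]}
            if written.isdisjoint(srcs):
                bundles[-1].append((dest, srcs))
                continue
        bundles.append([(dest, srcs)])
    return bundles

for cycle, bundle in enumerate(schedule(ops)):
    print(f"cycle {cycle}: " + " | ".join(d for d, _ in bundle))
```

On this input the five sequential operations compress into three bundles, so the toy machine finishes in three cycles instead of five. The hard part in practice is that real code rarely exposes its parallelism this neatly at compile time, which is exactly the magic-compiler problem.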
In a few minutes of thinking about it, I can come up with only two computing technologies that rose from a superior technological base, were funded more poorly than the incumbent technologies, and still achieved dominance: relational databases (1970s/1980s) and SSDs. Relational databases, with their associated technology, Structured Query Language (SQL), swept aside the competing database technologies within a few years, and they still dominate the database industry. Of course, the relational model and SQL were invented by IBM, which at the time was the world's largest computing R&D investor, but IBM was half-hearted about them because it had a massive revenue stream from its own non-relational database, IMS-DB. I think we all know the SSD versus HDD story. Can you think of any others that violate the R&D investment rule?