Most video wall procurements are decided by a demo. A vendor shows a polished wall, the committee is impressed, the contract is signed — and the operational reality eighteen months later is a different product from the one the demo implied. This is a playbook for running a bake-off that predicts the production outcome instead of the demo outcome: how to build the shortlist, structure the evaluation, weight the criteria, and avoid the mistakes that wreck procurements.
Before the bake-off — the requirements document
A bake-off without a written requirements document is a beauty contest. Before any vendor contact, document four things:
- The compliance frame. Which regulatory requirements apply — see the compliance map. This is the first filter; it decides the vendor pool before any technical evaluation.
- The source inventory. Every feed the wall must carry today and the projected count at year three. Type, resolution, transport.
- The operator model. How many operator seats, what the workflow looks like, whether multi-operator control is needed.
- The five-year budget envelope. Including refresh and support, not just the year-zero purchase. See the TCO breakdown for the model.
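A minimal sketch of what that document can look like in structured form, in Python; every field name and value here is hypothetical and stands in for the buyer's own inventory:

```python
from dataclasses import dataclass

@dataclass
class SourceFeed:
    name: str        # e.g. "perimeter cam 01"
    kind: str        # "IP camera", "VDI desktop", "HDMI capture", ...
    resolution: str  # "1920x1080", "3840x2160", ...
    transport: str   # "RTSP", "NDI", "SDI", "WebRTC", ...

@dataclass
class Requirements:
    compliance_frame: list[str]      # regulatory filters, e.g. ["FedRAMP Moderate"]
    sources_today: list[SourceFeed]  # every feed the wall must carry now
    sources_year3: int               # projected feed count at year three
    operator_seats: int
    multi_operator: bool             # is multi-operator control needed?
    budget_5yr_usd: int              # full envelope: purchase + refresh + support

req = Requirements(
    compliance_frame=["FedRAMP Moderate"],
    sources_today=[
        SourceFeed("perimeter cam 01", "IP camera", "1920x1080", "RTSP"),
        SourceFeed("ops dashboard", "VDI desktop", "3840x2160", "WebRTC"),
    ],
    sources_year3=48,
    operator_seats=4,
    multi_operator=True,
    budget_5yr_usd=420_000,
)
```

The value is not the code; it is that every later stage — the compliance filter, the demo scenario, the PoC acceptance criteria — reads from the same explicit record instead of from memory.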
Building the shortlist
Apply the compliance filter first — it is binary and it removes vendors fast. A Минцифры-registry requirement removes every non-RU vendor. A FedRAMP requirement removes vendors without the authorisation. Whatever remains is the eligible pool.
From the eligible pool, shortlist three to five vendors that span the architectural range — at least one hardware-controller option and at least one software-defined option, unless the requirements document already rules one out. The eight-platform comparison and the individual /vs/ pages are the starting reference for who fits where. A shortlist of more than five is unmanageable; fewer than three risks missing the right architecture.
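Mechanically, the two steps look like this; a minimal sketch, with all vendor names, certifications, and architecture labels hypothetical:

```python
# Hypothetical vendor records: name, certifications held, architecture type.
vendors = [
    {"name": "VendorA", "certs": {"FedRAMP Moderate"}, "arch": "hardware"},
    {"name": "VendorB", "certs": set(),                "arch": "software"},
    {"name": "VendorC", "certs": {"FedRAMP Moderate"}, "arch": "software"},
    {"name": "VendorD", "certs": {"FedRAMP Moderate"}, "arch": "hybrid"},
]

required_certs = {"FedRAMP Moderate"}  # from the requirements document

# Step 1: binary compliance filter — removes ineligible vendors outright.
eligible = [v for v in vendors if required_certs <= v["certs"]]

# Step 2: shortlist 3-5 from the eligible pool, spanning the
# architectural range (at least one hardware and one software option).
archs = {v["arch"] for v in eligible}
assert {"hardware", "software"} <= archs, "shortlist misses an architecture"
shortlist = eligible[:5]
print([v["name"] for v in shortlist])  # ['VendorA', 'VendorC', 'VendorD']
```

The assert encodes the rule from the paragraph above: a shortlist that cannot pass it is missing an architecture, not just a vendor.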
The six weighted criteria
Score each shortlisted vendor on six criteria, weighted by how often each decides the production outcome:
- Operational fit — 35%. Does the software match the real operator workflow, source mix, and IT operations model? The largest single factor, and the one demos hide.
- 5-year TCO — 25%. The full envelope, including refresh and support, not the sticker price.
- Vendor longevity and support — 15%. EOL track record, patch cadence, escalation paths.
- Source-mix breadth — 10%. Native support for the inventory from the requirements document, plus headroom for the year-three projection.
- Reference depth — 10%. Verifiable deployments in the buyer's industry, region, and scale band.
- Architecture flexibility — 5%. On-prem / cloud / hybrid posture, lock-in exposure.
The weighting matters more than the exact numbers. The point is that "the GUI looks nice" is not on the list — it has never decided a production outcome and it dominates demos precisely because it is easy to show.
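The scoring arithmetic itself is one line, but writing it down catches the silent errors — weights that do not sum to 1, a criterion dropped from a spreadsheet row. A minimal sketch with the six weights above and illustrative 1-10 scores for one hypothetical vendor:

```python
# The six criteria weights from the list above; they must sum to 1.0.
WEIGHTS = {
    "operational_fit":    0.35,
    "tco_5yr":            0.25,
    "vendor_longevity":   0.15,
    "source_mix_breadth": 0.10,
    "reference_depth":    0.10,
    "arch_flexibility":   0.05,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9

# Illustrative 1-10 scores for one hypothetical vendor.
scores = {
    "operational_fit": 8, "tco_5yr": 6, "vendor_longevity": 9,
    "source_mix_breadth": 7, "reference_depth": 8, "arch_flexibility": 5,
}

weighted_total = sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)
print(f"{weighted_total:.2f}")  # 7.40, on the same 1-10 scale
```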
Structuring the bake-off
Three stages, in order:
- Document review. Each shortlisted vendor responds to the requirements document in writing. This surfaces the compliance and source-mix gaps before anyone spends time on a demo.
- Structured demo. Not the vendor's standard demo — your demo. Hand each vendor the same scenario built from your real source inventory and operator workflow, and have them run it. A vendor that can only show their canned demo has told you something.
- Proof of concept. For the top one or two, a time-boxed PoC on real infrastructure with real sources. This is where operational fit gets verified — the 35% criterion cannot be scored from a demo.
The evaluation mistakes that wreck procurements
- Scoring the demo instead of the deployment. A demo runs on the vendor's hardware with the vendor's content. It predicts almost nothing about production. Insist on the PoC.
- Treating "software" as a single category. A perpetual-licence stack and a per-display subscription stack have wildly different 5-year TCO even though both are "software". Score the pricing model, not the label.
- Skipping the IT operations question. The single filter that decides most bake-offs is what the IT team can actually maintain. A site with no Linux operations capacity should not score a software-defined stack highly on operational fit, however good the product.
- Letting the committee score features they cannot weight. A feature list with 200 rows is noise. Score the six criteria; everything else is detail that rolls up into them.
- Ignoring the year-three projection. A wall sized for today's source count and operator seats, evaluated only against today, is a procurement that needs redoing in three years.
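The pricing-model mistake is worth making concrete, because the arithmetic is short. A sketch with entirely hypothetical figures, comparing a perpetual licence plus annual maintenance against a per-display subscription over the same five years:

```python
displays = 24
years = 5

# Hypothetical perpetual model: one-time licence + 18%/yr maintenance.
perpetual_licence = 60_000
maintenance_rate = 0.18
perpetual_5yr = perpetual_licence + perpetual_licence * maintenance_rate * years

# Hypothetical subscription model: flat per-display, per-year fee.
sub_per_display_yr = 900
subscription_5yr = sub_per_display_yr * displays * years

print(f"perpetual:    ${perpetual_5yr:,.0f}")     # $114,000
print(f"subscription: ${subscription_5yr:,.0f}")  # $108,000
```

With these made-up numbers the totals land within a few percent of each other; change the display count or the term and they diverge fast, which is exactly why the label "software" predicts nothing about the five-year envelope.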
The decision matrix
The output of the bake-off is a single matrix: shortlisted vendors as rows, the six weighted criteria as columns, a score in each cell, a weighted total per vendor. The matrix is not the decision — it is the structured input to the decision. If the weighted total and the committee's instinct disagree, that disagreement is the most useful output of the whole process: it means a criterion is mis-weighted, or the instinct is reacting to something not captured. Resolve it explicitly rather than overriding the matrix quietly.
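As code, the matrix is a mapping from vendor to per-criterion scores, and the weighted total is a one-line fold per vendor; a minimal sketch with two hypothetical vendors:

```python
WEIGHTS = {
    "operational_fit": 0.35, "tco_5yr": 0.25, "vendor_longevity": 0.15,
    "source_mix_breadth": 0.10, "reference_depth": 0.10,
    "arch_flexibility": 0.05,
}

# Rows: shortlisted vendors. Columns: the six criteria, scored 1-10.
matrix = {
    "VendorA": {"operational_fit": 6, "tco_5yr": 5, "vendor_longevity": 9,
                "source_mix_breadth": 7, "reference_depth": 9,
                "arch_flexibility": 4},
    "VendorC": {"operational_fit": 8, "tco_5yr": 8, "vendor_longevity": 7,
                "source_mix_breadth": 8, "reference_depth": 6,
                "arch_flexibility": 9},
}

totals = {
    vendor: sum(WEIGHTS[c] * score[c] for c in WEIGHTS)
    for vendor, score in matrix.items()
}
for vendor, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{vendor}: {total:.2f}")  # VendorC: 7.70 / VendorA: 6.50
```

If the printed ranking and the committee's instinct disagree, that is the signal described above: revisit the weights or name the uncaptured criterion before overriding.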
Where Craft Wall fits in a bake-off
Craft Wall scores well on the bake-off criteria where its architecture is the match — 5-year TCO, source-mix breadth, architecture flexibility, operational fit for a site with Linux operations capacity. It scores lower where the requirements document calls for a Tier 1 brand with a 15-20 year support horizon, Минцифры-registry sourcing, or sub-frame operator KVM latency.
The honest framing for a procurement team: Craft Wall is built to win the bake-offs where software-defined architecture is the right answer, and to lose cleanly the ones where it is not. The /vs/ comparison pages document exactly where each line falls against each major competitor — use them as the per-criterion reference when building the matrix.
Closing
A video wall bake-off run on a requirements document, scored on six weighted criteria, and verified with a real-infrastructure PoC predicts the production outcome. A bake-off run on demos predicts the demo. The extra effort of the structured process is small against the cost of discovering the wrong choice eighteen months into a five-year deployment.
Read next: the eight-platform comparison for the shortlist starting point, the TCO breakdown for the 5-year-cost criterion, and the interactive TCO calculator to score your own numbers.
Related reading
- Best video wall software in 2026: eight platforms compared honestly
- Software-defined vs hardware video wall controllers: a 5-year TCO breakdown
- Video wall compliance: the regulatory map for control-room procurement
- Migrating from a hardware video wall controller to a software-defined stack
- Craft Wall vs Userful Infinity Platform
- Craft Wall vs Datapath WallControl