How We Built a High-Performance Smart Home Knowledgebase
Our full stack, workflow, and SEO strategy.
Most AV and smart home companies share the same problems when it comes to their websites: they’re slow, poorly structured, and ultimately ineffective. Content is often thin or disconnected, pages exist without a clear purpose, and the overall experience feels more like a static brochure than a working system.
We wanted something completely different.
Rather than building a website that simply “looks good,” we set out to build something that performs — both technically and structurally. The goal was to create a platform that is fast, scalable, and designed for long-term visibility, while also being genuinely useful to clients trying to understand complex systems like WiFi, AV, and smart home integration.
Key takeaways
- We structured the site as a knowledgebase system, not isolated pages.
- We chose a static documentation framework (Docusaurus) for predictable performance and maintainability.
- Internal linking is treated as architecture—it’s designed in from day one.
- We optimise not just for rankings, but for retrieval and understanding (search engines and AI).
- We keep analytics and tooling simple, decision-ready, and privacy-first.
Introduction
Most websites in the smart home / AV space fail in the same ways:
- They’re slow.
- They’re hard to navigate.
- They don’t explain complex systems clearly.
- They don’t build authority over time.
From an SEO perspective, this leads to weak visibility. From a client perspective, it creates uncertainty.
We approached the project as a system design exercise: define the structure first, then choose tools that support it.
A knowledgebase, not a brochure
The biggest shift in this project was philosophical.
Instead of approaching the site as a collection of pages, we treated it as an interconnected knowledge system. That meant thinking about how information is organised, how topics connect, and how both users and machines navigate the content.
A structured content model
At the core is a structured content model. Services, case studies, knowledgebase articles, and location pages exist as part of a wider framework rather than as isolated pieces. Each page supports others, and every piece of content is written with a clear role in mind.
This structure allows the site to grow in a controlled way. As new articles are added, they don’t sit in isolation — they strengthen existing pages through internal linking and shared context. Over time, this creates a more robust and interconnected platform.
Internal linking as architecture (not an afterthought)
Internal linking plays a critical role. Rather than being added at the end of the process, it is designed into the content from the start.
Articles link to relevant services, case studies reinforce technical explanations, and location pages provide context for local relevance. This helps users navigate naturally, but it also helps search engines and AI systems understand how everything fits together.
Traditional SEO is largely about ranking individual pages. Our approach is slightly different: build a system that can be clearly interpreted, understood, and retrieved — whether that’s by Google, or by an AI assistant generating an answer for a user.
Designed to be understood (search and AI)
As search changes, structure matters more. Content needs to be:
- clearly segmented into sections
- written to answer specific questions
- connected through internal links
- consistent in terminology and intent
The goal is not just to “rank,” but to be understood and reliably referenced.
Building the platform: tools that support the system
The tools we use were chosen to support this structure, not define it. In several cases, we avoided common options in favour of solutions that offered more control and predictability.
Keeping the dev environment simple
We keep the development environment intentionally simple:
- Visual Studio Code
- Node.js + npm for builds and dependencies
- Git-based content management and versioning
This allows rapid iteration and full visibility into how the site behaves without the overhead of a traditional CMS.
Why Docusaurus
At the core of the platform is Docusaurus. It’s a static site framework usually used for documentation, which made it a natural fit for what we were building.
Compared to a plugin-heavy CMS, Docusaurus provides:
- markdown-based content with predictable behaviour
- clean routing and explicit page relationships
- strong baseline performance (static output, fast loads)
- versionable content and controlled changes over time
It lets the website behave more like a well-organised system than a collection of loosely connected pages.
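To make “explicit page relationships” concrete: in Docusaurus, the content hierarchy lives in a single sidebar definition rather than emerging implicitly from a CMS menu. The sketch below is illustrative only — the category labels and document IDs are hypothetical, not our actual structure:

```typescript
// sidebars.ts — illustrative sketch; real doc IDs and labels will differ
import type {SidebarsConfig} from '@docusaurus/plugin-content-docs';

const sidebars: SidebarsConfig = {
  knowledgebase: [
    {
      type: 'category',
      label: 'WiFi & Networking',
      items: [
        // each ID maps to a markdown file in the docs folder
        'wifi/mesh-vs-access-points',
        'wifi/coverage-planning',
      ],
    },
    {
      type: 'category',
      label: 'Smart Home Integration',
      items: ['smart-home/choosing-a-platform'],
    },
  ],
};

export default sidebars;
```

Because this hierarchy is one version-controlled file, adding an article forces an explicit decision about where it sits in the structure — which is exactly the “system, not pages” behaviour described above.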
Our AI workflow: orchestrating systems, not replacing thinking
AI played a major role in building this platform, but not in the way most people expect.
Rather than relying on a single tool to generate content, we use a combination of systems—each chosen for a specific strength. The value comes from orchestration: using the right tool for the right stage of the workflow.
TrySixth as the system layer
TrySixth acts as the primary system layer. It helps navigate the project structure, maintain consistency across files, and manage larger changes without losing context. For a project where structure and continuity are critical, this is extremely valuable.
ChatGPT for structured problem-solving
We use ChatGPT for structured thinking and problem-solving: breaking down complex topics, refining workflows, and keeping outputs logical and consistent.
Claude for long-form audits and clarity
Claude is used in a complementary way, primarily for long-form reasoning and audits. It’s especially useful when reviewing larger sections of content, identifying gaps, and improving clarity.
The important point is that AI is not being used to “write content quickly.” It supports a structured process. The final result is still shaped through iteration and refinement.
Analytics: clarity over complexity
Understanding how the site performs is essential, but the way data is presented matters just as much as the data itself.
Rather than relying on complex analytics platforms filled with unnecessary detail, we chose a more focused approach.
We use Rybbit because it provides a clean, accessible view of how users interact with the site—without turning analytics into its own full-time job.
This makes it easier to understand key behaviours:
- which pages are performing well
- how users move through the site
- where drop-offs happen
- where improvements are needed
Another important factor was privacy. Rybbit takes a modern, privacy-first approach, which aligns with how we believe websites should operate.
The goal isn’t to collect as much information as possible. It’s to collect the right information and present it in a way that supports clear decision-making.
SEO and the shift toward AI visibility
Search is changing, and any modern website needs to account for that.
We use SEMrush as our primary SEO platform. It supports traditional SEO work like:
- keyword research
- technical audits
- competitor analysis
It also helps identify and resolve issues like redirect conflicts, canonical inconsistencies, and sitemap errors—details that can have an outsized impact if ignored.
Writing for retrieval (not just ranking)
Our approach goes beyond traditional SEO.
There is a clear shift towards AI-driven search, where users increasingly rely on assistants and large language models to retrieve information rather than clicking through results. That changes how content needs to be structured.
This influences how we write and organise content:
- answer specific questions clearly
- use predictable heading structures (H2/H3)
- reinforce relationships with internal links
- keep terminology consistent (entities and concepts)
In simple terms: the site is not just designed to rank. It is designed to be understood.
Infrastructure and technical control
Performance and predictability are critical—especially when SEO is involved.
We run a controlled hosting environment using Nginx, which allows us to manage routing, redirects, and headers with precision. This helps maintain consistency across the site and avoids common issues that can quietly degrade search performance over time.
Redirects, trailing slashes, and duplicate URLs
One of the more complex challenges involved aligning trailing slash behaviour across the system. Differences between server configuration and framework output can lead to:
- redirect loops
- duplicate pages
- inconsistent canonicals
Resolving this required careful coordination between the server, the build configuration, and internal linking patterns.
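On the framework side, Docusaurus exposes a `trailingSlash` option that controls which URL form the build emits. A minimal sketch of how that choice is pinned down (the title and domain are placeholders):

```typescript
// docusaurus.config.ts — illustrative sketch, not our production config
import type {Config} from '@docusaurus/types';

const config: Config = {
  title: 'Example Knowledgebase',   // placeholder values
  url: 'https://www.example.com',
  baseUrl: '/',
  trailingSlash: false,             // emit /topic, never /topic/
  // The web server must make the same choice — e.g. an Nginx rule that
  // 301-redirects /topic/ to /topic — so that internal links, server
  // redirects, and canonicals all agree on a single URL form.
};

export default config;
```

Once the framework and the server agree on one form, redirect loops and duplicate URL variants stop appearing in the first place rather than needing to be patched afterwards.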
Canonicals and sitemaps (small details, big impact)
We also addressed canonical and sitemap behaviour so only the correct versions of pages are indexed and the overall structure stays clean.
These details are small on the surface, but they strongly influence how search engines interpret a site—especially as it scales.
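As one example of that tuning, the classic preset’s sitemap plugin can be configured so only indexable pages are listed. The option names below are real plugin options; the values are illustrative:

```typescript
// Fragment of docusaurus.config.ts — preset options, illustrative values
const presets = [
  [
    '@docusaurus/preset-classic',
    {
      sitemap: {
        changefreq: 'weekly',
        priority: 0.5,
        ignorePatterns: ['/tags/**'], // keep tag archives out of the sitemap
        filename: 'sitemap.xml',
      },
    },
  ],
];
```

Canonical tags, meanwhile, are derived by the theme from the site’s `url` and `baseUrl` settings, so keeping those consistent with the server configuration is most of what keeps canonicals clean.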
Challenges and iteration
This project was not built in a single pass.
Along the way, we encountered and resolved technical challenges—from redirect loops and URL inconsistencies to SEO-tool warnings that required careful interpretation.
For example, large numbers of internal anchor links can appear as “errors” in some audits, even though they’re a natural part of structured documentation. Knowing which issues are genuinely important—and which are simply noise—is part of building a reliable system.
Each iteration refined the platform. The result is more stable, more predictable, and better aligned with how users (and machines) actually navigate content.
What this means for our clients
Although this article focuses on how the website was built, the approach maps directly to how we design systems for clients.
Whether it’s a WiFi network, an AV setup, or a full smart home integration, the same principles apply:
- structure before complexity
- compatibility before features
- long-term performance over short-term fixes
This means:
- systems that work together properly
- designs that scale over time
- installations that are reliable and easy to manage
Just like this site, the goal is not complexity. It is clarity, structure, and performance.
Glossary
Docusaurus
A static site framework designed for documentation. It turns markdown content into a fast, structured website. Learn more at https://docusaurus.io/.
Static site
A site that’s generated ahead of time (build step) and served as files (HTML/CSS/JS), rather than being assembled dynamically on every request. This is a major reason documentation-style sites can be extremely fast and stable.
Internal linking
Links between pages on the same site. Done well, internal linking helps users navigate and helps search engines (and AI systems) understand relationships between topics.
Canonical URL
A signal that tells search engines which URL should be treated as the “main” version of a page when multiple variations exist (for example, with/without trailing slashes).
Trailing slash
A URL formatting detail (e.g., /topic/ vs /topic). If not managed consistently, it can create duplicate URLs and redirects that harm performance and SEO clarity.
Robots.txt
A file that helps guide crawler behaviour (what to crawl, what not to crawl). It’s not a security control, but it can prevent wasted crawl budget and accidental indexing of irrelevant areas.
Frequently asked questions
Why not WordPress?
For a content-heavy site where structure, versioning, and predictable performance matter, a static documentation framework can be easier to maintain than a plugin-heavy CMS. The goal was control and clarity, not convenience at the cost of complexity.
Does a knowledgebase actually help clients?
Yes—if it’s structured. A knowledgebase reduces uncertainty by explaining concepts clearly, showing how systems fit together, and helping clients self-educate before a conversation. It also compounds SEO value over time through internal linking.
How do you balance SEO with performance?
By treating performance as a core requirement (not an optimisation pass). Static output, clean routing, consistent canonicals, and disciplined redirects reduce technical debt and protect long-term search visibility.
How do you structure content for AI search?
Write to be retrieved: clear headings, direct answers, consistent terminology, and internal links that reinforce relationships. The goal is to make the content easy to interpret—whether by a search engine or an AI system.
Final thoughts
This platform is designed to evolve.
As new content is added, the system becomes stronger. Each article, case study, and service page contributes to a wider structure that continues to grow over time.
The tools will evolve, and the way search works will continue to change—but the underlying principles remain the same:
Clear structure. Intentional design. Long-term thinking.
We didn’t set out to build a website.
We built a system—one that is designed to perform, scale, and be understood.
