Installation and configuration
This document explains how to launch the AI Product Support module in a Magento 2 store and how to prepare it for use.
The document is written for a store that wants to implement a ready-made module and start using AI chat without going into the technical details of how it works on the code side.
It is best to treat this module as a user-facing layer. First, the store prepares the knowledge, and only then makes it available to employees or customers as a chat.
What to prepare before installation
Before implementation, you need:
- a working Magento 2 store,
- server access,
- Composer,
- an OpenAI account,
- a prepared AI knowledge base for the store,
- server internet connectivity.
The most important practical requirement is simple: the module will provide good answers only when the store has prepared content that AI can use.
If the store already uses the Kowal AI Product Feed module, it can serve as the tool for preparing and organizing content for AI. On that module page, this is described as building an organized and up-to-date knowledge base for AI systems. AI Product Support is the natural next step, meaning the use of that knowledge in a conversation with the user. Source: Kowal AI Product Feed for OpenAI Vector Store
Module installation
The module is installed via Composer.
Example installation process:
composer config repositories.ai.product.support vcs https://github.com/kowalco/ai-product-support
composer config --global --auth github-oauth.github.com
composer require kowal/module-ai-product-support
bin/magento module:enable Kowal_AiProductSupport
bin/magento setup:upgrade
bin/magento cache:clean
If the store runs in production mode, also execute after installation:
bin/magento setup:di:compile
bin/magento setup:static-content:deploy -f
bin/magento cache:clean
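Before moving on to configuration, it is worth confirming that Magento actually registered the module. The command below is a standard Magento CLI command and can be run from the Magento root directory:

```shell
# Confirm that Magento sees the module and that it is enabled.
# Run from the Magento root directory.
bin/magento module:status Kowal_AiProductSupport
```

For a correctly installed module, the command should report that the module is enabled. If it reports the module as disabled or unknown, repeat the enable and setup:upgrade steps above.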
Where the configuration is located
After installation, you can find the module settings here:
Stores > Configuration > Kowal AI > AI Product Support
How to configure the module
1. Enable chat in the admin panel
In the General section, enable the Enable Chat option.
After saving the setting, the AI tab will appear in the admin panel.
2. Decide whether to enable chat on the frontend
If you want store customers to be able to use the chat as well, enable the Enable Frontend Chat option.
If the module is intended only for the store team, leave the frontend disabled.
3. Set the question length
The Maximum Question Length field defines the maximum length of a message the user can send. This keeps conversations organized and prevents overly long, unreadable requests.
4. Set the default store view
If the store operates in multiple language versions or has several store views, you can specify a default store view for the chat.
This makes work in the panel easier and helps start from the correct context.
5. Enable or disable manual store view switching
The Allow Store Switcher field determines whether the panel user can change the store view directly in the chat popup.
This is useful when one team supports several store versions.
6. Enable technical logging during implementation
The Log Chat Requests field is worth enabling during configuration and testing. This makes it easier to check whether the module works correctly. After the production rollout, you can decide whether logging should remain active.
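With logging enabled, you can watch the log output while testing. The exact log file used by this module is not documented here; the paths below are assumptions based on where Magento modules typically write logs (`var/log/` under the Magento root):

```shell
# Follow Magento's standard log files while sending test chat questions.
# The specific file this module writes to may differ; these paths are
# the usual Magento defaults, not confirmed for this module.
tail -f var/log/system.log var/log/debug.log
```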
AI connection settings
In the OpenAI section, configure the basic elements required for the module to work.
OpenAI API Key
This is the access key to the AI service. Without it, the module will not retrieve the list of models, read the knowledge source, or send a question.
Response Model
This is the model responsible for building answers.
The model list is loaded from the API. If no options appear after saving the key, you can use the button to refresh the model list.
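If the model list stays empty even after refreshing, you can check the key and the server's connectivity directly against the OpenAI API, outside the module. This uses the standard OpenAI models endpoint; `$OPENAI_API_KEY` stands for the key entered in the module configuration:

```shell
# List the models available to the configured OpenAI API key.
# A JSON response with a "data" array means the key and the
# connection work; an "error" object points at the key itself.
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```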
Vector Store
This is the selected knowledge source for the module.
The simplest way to understand it is this:
- it is the place where the content used by the chat when answering is stored,
- the module looks for answers there,
- if you choose the wrong source, the answers will be weak or incomplete.
If the store already has an organized knowledge base prepared for AI, this is where you select that source.
The list of knowledge sources can also be refreshed from the configuration level.
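As with models, the list of vector stores can be checked directly against the OpenAI API to confirm that the knowledge source exists and is visible to the configured key. The `OpenAI-Beta` header is required by the OpenAI Vector Stores API:

```shell
# List vector stores visible to the configured key.
# The store selected in the module configuration should
# appear in the "data" array of the response.
curl -s https://api.openai.com/v1/vector_stores \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "OpenAI-Beta: assistants=v2"
```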
Maximum File Search Results
This setting defines how many supporting materials the module takes into account when preparing one answer.
In practice, it affects:
- answer quality,
- response speed,
- the cost of using AI.
A good starting setting is a medium value, for example 6.
Frontend security
If the chat runs on the store side, it is worth setting security limits right away.
In the Frontend Security section, you will find:
- Requests Per Minute
- Requests Per Hour
- Minimum Submit Delay
These settings help limit:
- questions being sent too frequently,
- abuse from bots,
- unnecessary resource and cost usage.
Additional system prompt
In the Prompting section, you can fill in an additional system prompt.
This field is not required. In most implementations, you can leave it empty and use the module default setting.
Overriding it makes sense only when the store wants to introduce its own answer style or additional communication rules.
How the module works from the user perspective
In the admin panel
After the module is enabled, the user sees an AI tab at the right edge of the screen. Clicking it opens the chat panel.
In the panel, you can:
- enter a question,
- receive an answer,
- see product cards if the answer concerns a specific product,
- work without reloading the page.
On the frontend
If the chat has been enabled, a similar tab appears on the store side. The customer can ask a question without leaving the product page or listing.
How to think about this type of implementation
The most practical model looks like this:
- the store organizes content and prepares the knowledge base,
- the administrator selects that knowledge base in the module configuration,
- the user uses the chat,
- answers are built based on content prepared by the store.
This approach gives a better result than launching the chat alone without prepared data.
Recommended starting configuration
Admin panel
- Enable Chat = Yes
- Enable Frontend Chat = depending on the project
- Maximum Question Length = 1000
- Allow Store Switcher = Yes for multiple store views
- Log Chat Requests = Yes during testing
- Maximum File Search Results = 6
Frontend
- Requests Per Minute = conservative starting value
- Requests Per Hour = value adjusted to store traffic
- Minimum Submit Delay = at least 1
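The recommended values can also be applied from the command line. `bin/magento config:set` and `bin/magento cache:clean` are standard Magento commands, but the configuration paths below are illustrative guesses, not confirmed paths for this module; check the actual paths in Stores > Configuration > Kowal AI > AI Product Support before scripting them:

```shell
# Sketch only: the config paths are assumed, verify them in the
# admin panel (or the module's system.xml) before relying on this.
bin/magento config:set kowal_ai_product_support/general/enabled 1
bin/magento config:set kowal_ai_product_support/general/max_question_length 1000
bin/magento cache:clean config
```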
What to check after implementation
After saving the configuration, perform a simple test:
- check whether the AI tab appeared in the admin panel,
- open the popup and send a question,
- make sure the answer appears correctly,
- check whether the product section appears for questions about specific products,
- if the frontend is enabled, check how the tab works in the store as well.
Most common issues
Models or knowledge sources are not visible
This most often means:
- an incorrect OpenAI key,
- no internet connection from the server,
- cache not cleared after changes.
The chat answers too poorly or misses the topic
The most common cause is not the module itself, but the quality of the prepared knowledge base. If the content is incomplete, outdated, or too limited, the answers will also be weaker.
Before evaluating the module itself, it is worth checking:
- whether product descriptions are meaningful and complete,
- whether the FAQ and documentation are up to date,
- whether the knowledge base actually contains content users need,
- whether the selected knowledge source is appropriate for the given store.
The frontend blocks the user too quickly
In that case, check the limit and submission delay settings in the Frontend Security section.
Short implementation checklist
- Install the module via Composer.
- Enable the module and run setup:upgrade.
- Configure the OpenAI connection.
- Select the correct knowledge source.
- Enable chat in the admin panel.
- Optionally enable chat on the frontend.
- Set security limits.
- Test the behavior using questions about real products and store content.
Short description for the Installation and configuration section
The module is installed via composer and configured in Stores > Configuration > Kowal AI > AI Product Support. The administrator enters OpenAI access data, selects the model and the knowledge source for the store, and then enables the chat in the admin panel and optionally on the frontend. In addition, security limits can be set and the basic operating parameters of the module can be adjusted.