Meta description: Django and React i18n often drift apart. Use one workflow to keep .po files and React JSON translations synced and deployable.
You already know the failure mode.
`makemessages` runs cleanly. Your Django templates and `gettext_lazy()` strings are covered. Translators work in `.po` files. `compilemessages` finishes and the server-side UI looks right.
Then the React frontend lands, and your translation workflow splits in half.
Now you have Django `.po` files on one side, React JSON catalogs on the other, and a team copy-pasting the same strings between two formats. Someone updates a label in Python, nobody updates the React key, and a week later your checkout page shows English in one component and German everywhere else. This is the primary issue with internationalization in React on a Django stack. It’s not the library setup. It’s the second source of truth.
I’ve seen teams make this worse by treating frontend i18n as a separate system with separate translators, separate review, and separate naming rules. You get duplicate work, inconsistent terminology, and Git diffs nobody trusts.
A better workflow starts from a blunt rule. Django owns the source strings. React consumes derived artifacts.
The Disconnect Between Django i18n and React
Django developers usually hit the wall the same way. Backend pages already use LocaleMiddleware, templates are marked, and the locale tree is stable:
```shell
python manage.py makemessages -l de
python manage.py compilemessages
```
Then the frontend team adds react-i18next or react-intl, asks for JSON files, and the old workflow stops being enough. Django produces this:
```
#: billing/views.py:18
msgid "Welcome back, %(name)s"
msgstr "Willkommen zurück, %(name)s"
```
React wants something closer to this:
```json
{
  "welcome_back": "Willkommen zurück, {{name}}"
}
```
Those formats are close enough to look compatible, and different enough to break in production.
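The gap is mechanical, which means it can be bridged mechanically. Here is a minimal sketch of the placeholder rewrite; the function name and the `{{value}}` fallback are conventions of this sketch, not part of either library:

```python
import re


def to_i18next(text: str) -> str:
    """Rewrite gettext-style placeholders into i18next's {{name}} style."""
    # %(name)s -> {{name}}
    text = re.sub(r"%\((\w+)\)s", r"{{\1}}", text)
    # Bare %s carries no name; "value" here is an arbitrary team convention.
    text = text.replace("%s", "{{value}}")
    return text


print(to_i18next("Welcome back, %(name)s"))  # Welcome back, {{name}}
```

The point is not the regex. The point is that once a rule like this exists in code, neither side has to remember it.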
Where teams usually go wrong
Most bad setups have one of these traits:
- Two sources of truth: backend strings live in `.po`, frontend strings live in JSON.
- Manual syncing: someone exports, reformats, and commits translation files by hand.
- Placeholder drift: Django uses `%(name)s`, React uses `{{name}}`, and one side forgets to convert.
- Key chaos: component authors invent keys ad hoc, so the same term appears under multiple IDs.

Practical rule: if your translators or developers have to touch both `.po` and JSON by hand, your process is already off course.
That disconnect is why so many React i18n guides feel incomplete for Django teams. They explain hooks and providers, but skip the part where your stack already has a mature i18n system. If you want a frontend-only view of the JavaScript side, the guide on how translation works in JavaScript apps is a useful companion. For a Django shop, though, the main benefit is joining both worlds instead of running them in parallel.
Choosing Your React Internationalization Library
Your first bad library choice usually shows up during the first translation sync, not the first demo. Django exports strings cleanly into .po files, the React side wants JSON, and suddenly the frontend library decides how much conversion logic, placeholder rewriting, and runtime formatting you now own.
For a Django and React stack, I would start by deciding one thing: is React consuming translations that Django already owns, or is React authoring its own messages? That answer narrows the field fast.

What I’d pick for a Django-backed app
react-i18next is the default I recommend when Django remains the source of truth and React receives generated JSON catalogs. Its API is simple, it accepts plain JSON without forcing a compile pipeline, and its package activity is easy to verify on npm for react-i18next. That matters in a production stack because fewer moving parts means fewer chances to break locale updates during deploys.
react-intl works well for teams that already standardize on ICU message syntax and want formatting rules close to the message definitions. That package also has healthy adoption, which you can check directly on npm for react-intl. I use it when product copy is full of plural rules, currencies, dates, and locale-sensitive phrasing that benefits from ICU-first authoring.
Lingui is a better fit when the frontend owns extraction and compilation. In a Django-first setup, that usually creates friction. You end up maintaining one workflow for Python strings and another for React strings, which is exactly the split this article is trying to avoid.
If your team is also planning regional variants inside the same language, such as en-US versus en-GB or US-market-specific phrasing, the best strategies for localizing your React app are worth reviewing before you lock in key structure and locale naming.
React i18n Library Comparison
| Feature | react-i18next | react-intl | Lingui |
|---|---|---|---|
| Best fit | Django + React with JSON catalogs | Teams committed to ICU message format | Frontend-led extraction workflows |
| Translation format | JSON, flexible key structure | Message catalogs, ICU-oriented | PO/catalog workflow, compile step |
| Django integration | Strong, easy to feed from converted .po data | Good, but message syntax decisions matter more | Possible, but less natural for .po as upstream source |
| Formatting support | Dates, numbers, currencies, plurals via Intl API | Locale-aware formatting is a core strength | Good support, usually through compiled messages |
| Ecosystem signal | Large adoption footprint | Mature and widely used | Smaller ecosystem |
| My default recommendation | Yes | Only if your team wants ICU-first patterns | Only if React owns localization end to end |
Trade-offs that actually matter
The main trade-off is not developer ergonomics in a tutorial. It is operational consistency.
react-i18next is usually the least painful option when your deployment process already depends on Django extracting, reviewing, and compiling translations. Converting .po into JSON is straightforward. Mapping placeholder syntax is still work, but it is predictable work, and you can automate it.
react-intl can be the better choice if message formatting complexity is the primary problem in your app. I would make that choice deliberately, because ICU syntax becomes part of your translation contract. That is fine if translators, backend developers, and frontend developers all agree on it. It is a headache if Django still produces one style of placeholder and React expects another.
Lingui is good software. I just would not put it in front of a Django-owned localization process unless there is a clear reason to let the frontend drive extraction.
My rule is simple: choose the library that matches your upstream translation workflow, not the one with the nicest component examples. In a Django plus React app, that usually means react-i18next.
Setting Up a Scalable i18n Architecture in React
A React app gets messy fast if translations live next to random components. The clean pattern is an i18n directory with three parts: a translations folder, a config file, and a centralized key file, as described in this scalable React i18n architecture guide.

Use a predictable folder layout
Here’s a layout that scales without getting cute:
```
src/
  i18n/
    index.js
    keys.js
    translations/
      base.json
      en.json
      de.json
  components/
  pages/
  main.jsx
```
base.json is your template or reference catalog. en.json, de.json, and the rest are generated or maintained language files. keys.js gives developers one place to import stable identifiers.
```js
// src/i18n/keys.js
export const I18N_KEYS = {
  common: {
    save: 'common.save',
    cancel: 'common.cancel',
  },
  billing: {
    welcomeBack: 'billing.welcome_back',
    invoiceCount: 'billing.invoice_count',
  },
};
```
Initialize react-i18next once
Use one config file and import it at app startup.
```js
// src/i18n/index.js
import i18n from 'i18next';
import { initReactI18next } from 'react-i18next';
import en from './translations/en.json';
import de from './translations/de.json';

i18n
  .use(initReactI18next)
  .init({
    resources: {
      en: { translation: en },
      de: { translation: de },
    },
    lng: 'en',
    fallbackLng: 'en',
    interpolation: {
      escapeValue: false,
    },
  });

export default i18n;
```
Then import it once in your entrypoint:
```jsx
// src/main.jsx
import React from 'react';
import ReactDOM from 'react-dom/client';
import App from './App';
import './i18n';

ReactDOM.createRoot(document.getElementById('root')).render(
  <React.StrictMode>
    <App />
  </React.StrictMode>
);
```
A lot of teams skip the centralized key file and tell developers to pass raw strings to t(). That works at first. Later, refactors turn into grep sessions.
If you’re dealing with US variants like en-US and en-GB, regional terminology matters just as much as language support. The write-up on best strategies for localizing your React app is worth a read for market-specific naming and formatting choices.
What works and what doesn’t
- Works: one config file, one key map, one generated translations directory.
- Works: lazy-loading by namespace once your catalog grows.
- Doesn’t: storing ad hoc translation objects inside component files.
- Doesn’t: mixing human-readable keys and sentence-as-key patterns in the same app.
Practical i18n Patterns in Your Components
A common failure shows up after the first real translation pass. The React UI renders translated strings, but billing totals still use the wrong currency format, a German label blows up a button width, and an Arabic screen flips text without flipping layout. Component-level i18n is where those mistakes become visible.
The rule I keep is simple. Components should ask for messages and receive formatted values. They should not invent sentence structure, concatenate fragments, or hide fallback English in JSX. That matters even more in a Django plus React stack, because the strings usually started life in Django .po files and need to survive conversion into React-friendly JSON without losing placeholders or meaning. If you need to keep those placeholder rules consistent with your backend catalog, document them next to your conversion process. A short reference for working with PO files in a frontend translation pipeline helps keep that contract explicit.
Basic component usage
For day-to-day components, keep the rendering code boring and predictable:
```jsx
import { useTranslation } from 'react-i18next';
import { I18N_KEYS } from '../i18n/keys';

export function BillingHeader({ name }) {
  const { t } = useTranslation();
  return (
    <h1>{t(I18N_KEYS.billing.welcomeBack, { name })}</h1>
  );
}
```
And the translation file:
```json
{
  "billing": {
    "welcome_back": "Welcome back, {{name}}"
  }
}
```
This pattern avoids a problem I see in mixed Django and React codebases all the time. A developer writes "Welcome back, " + name in React because it feels faster, while the backend already has a translated sentence with a placeholder. Now the frontend and backend disagree about word order, punctuation, and translator context. Keep the sentence whole.
Plurals and formatting
Plural logic belongs in the i18n library. English makes ternary-based plurals look harmless. Other languages expose the shortcut fast.
```json
{
  "billing": {
    "invoice_count_one": "{{count}} invoice",
    "invoice_count_other": "{{count}} invoices"
  }
}
```

```jsx
<p>{t(I18N_KEYS.billing.invoiceCount, { count })}</p>
```
Formatting is a separate concern. I prefer formatting numbers, dates, and currency before they hit the message unless the grammar depends on the raw value.
```js
const amount = new Intl.NumberFormat(locale, {
  style: 'currency',
  currency: 'EUR',
}).format(total);

const createdAt = new Intl.DateTimeFormat(locale, {
  dateStyle: 'medium',
  timeStyle: 'short',
}).format(new Date(invoiceDate));
```
Then pass those formatted values into the translated string:
```jsx
<p>
  {t(I18N_KEYS.billing.invoiceSummary, {
    amount,
    createdAt,
  })}
</p>
```
That split pays off during audits. Scattered toLocaleString() calls are hard to review, hard to test, and easy to make inconsistent with what Django is doing in emails, invoices, or server-rendered templates.
If you want a starter project to inspect for general React structure, lunabloomai's React Starter App is a useful reference for how teams package shared app concerns. Just don’t copy any starter blindly into an i18n-heavy app without deciding who owns message files.
RTL and placeholders
RTL support starts at the app shell. Translating text without setting document direction leaves drawers, spacing, icons, and alignment half broken.
```jsx
import { useEffect } from 'react';
import { useTranslation } from 'react-i18next';

export function DirectionController() {
  const { i18n } = useTranslation();

  useEffect(() => {
    const rtlLocales = ['ar', 'he'];
    const isRtl = rtlLocales.some((code) => i18n.language.startsWith(code));
    document.documentElement.dir = isRtl ? 'rtl' : 'ltr';
    document.documentElement.lang = i18n.language;
  }, [i18n.language]);

  return null;
}
```
Also use CSS logical properties where possible. `margin-inline-start` ages better than hardcoded `margin-left` once the UI needs to flip.
A few rules keep component code from drifting away from the Django catalog:
- Pass values through interpolation: translators can reorder placeholders, and reviewers can compare React JSON to the original `.po` entry without reading JSX logic.
- Keep fallback text out of components: hidden English strings never make it back into the translation workflow.
- Use stable keys: renaming keys casually throws away translator history and makes `.po` to JSON sync noisier than it needs to be.
- Test ugly locales on purpose: German exposes width problems, Arabic exposes direction bugs, and Japanese exposes assumptions about spacing and line breaks.
The practical standard is straightforward. Components render translated messages, formatting helpers produce locale-aware values, and the message shape stays close to the Django source so the sync step stays mechanical instead of fragile.
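One cheap way to “test ugly locales on purpose” before real translations exist is pseudolocalization: pad every message and bracket it with markers, so truncated layouts and hardcoded English jump out on sight. A rough sketch, where the padding ratio and markers are arbitrary choices:

```python
import re


def pseudolocalize(message: str) -> str:
    """Pad a message and bracket it, leaving {{placeholders}} untouched."""
    # The capture group makes re.split keep the placeholders in the result.
    parts = re.split(r"(\{\{\w+\}\})", message)
    padded = "".join(
        part if part.startswith("{{") else part + "~" * (len(part) * 2 // 5)
        for part in parts
    )
    return f"[!! {padded} !!]"


print(pseudolocalize("Welcome back, {{name}}"))
```

Wire a pseudo-locale like this into a dev-only catalog and German-length strings stop being a surprise at translation time.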
Syncing Django PO Files with React JSON
Here’s the part most articles skip. Your Django .po files should stay upstream. React should consume generated JSON built from them.

A common project layout looks like this:
```
locale/
  de_DE/
    LC_MESSAGES/
      django.po
  fr_FR/
    LC_MESSAGES/
      django.po
frontend/
  src/
    i18n/
      translations/
```
The hard part isn’t reading .po files. The hard part is preserving placeholders, HTML tags, and structured strings during conversion. That problem is called out directly in Robin Wieruch’s React internationalization article, and it’s the place where rushed automation usually breaks.
A Python conversion script that fits Django teams
Use polib. It’s boring and dependable.
```python
# scripts/po_to_json.py
from pathlib import Path
import json
import re

import polib

BASE_DIR = Path(__file__).resolve().parent.parent
LOCALE_DIR = BASE_DIR / "locale"
OUTPUT_DIR = BASE_DIR / "frontend" / "src" / "i18n" / "translations"


def django_to_i18next_placeholders(text):
    if not text:
        return text
    # %(name)s -> {{name}}
    text = re.sub(r"%\((\w+)\)s", r"{{\1}}", text)
    # Bare %s carries no name; {{value}} is a project convention, so document it.
    text = re.sub(r"%s", r"{{value}}", text)
    # Positional {0} -> {{0}}; map deliberately and consistently.
    text = re.sub(r"\{(\d+)\}", r"{{\1}}", text)
    return text


def po_to_dict(po_path):
    po = polib.pofile(str(po_path))
    data = {}
    for entry in po:
        if not entry.msgid or entry.obsolete:
            continue
        key = entry.msgid
        # Fall back to the source string when a translation is missing.
        value = entry.msgstr or entry.msgid
        data[key] = django_to_i18next_placeholders(value)
    return data


def main():
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    for locale_path in LOCALE_DIR.iterdir():
        po_path = locale_path / "LC_MESSAGES" / "django.po"
        if not po_path.exists():
            continue
        # de_DE -> de-DE, matching frontend locale naming.
        locale_code = locale_path.name.replace("_", "-")
        data = po_to_dict(po_path)
        output_path = OUTPUT_DIR / f"{locale_code}.json"
        output_path.write_text(
            json.dumps(data, ensure_ascii=False, indent=2) + "\n",
            encoding="utf-8",
        )


if __name__ == "__main__":
    main()
```
Install the dependency:
```shell
python -m pip install polib
```
That script uses msgid as the React key. I prefer explicit symbolic keys for larger apps, but if your existing Django catalog uses full source strings as IDs, this gets you moving without a migration project.
Keep conversion deterministic. If the script reformats keys differently on every run, your translation diffs become noise.
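The cheapest way to get that determinism is to sort keys at dump time, so a re-run only changes the lines whose translations actually changed:

```python
import json


def dump_catalog(data: dict) -> str:
    """Serialize a catalog with stable key ordering so diffs stay reviewable."""
    return json.dumps(data, ensure_ascii=False, indent=2, sort_keys=True) + "\n"


# The same entries in any insertion order produce byte-identical output.
a = dump_catalog({"billing.save": "Speichern", "billing.cancel": "Abbrechen"})
b = dump_catalog({"billing.cancel": "Abbrechen", "billing.save": "Speichern"})
assert a == b
```

With `sort_keys=True` in place, a noisy diff on a catalog file means a real content change, not an ordering accident.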
Placeholder conversion is where things break
Django and React don’t speak the same interpolation dialect:
| Django form | React i18n target |
|---|---|
| `%(name)s` | `{{name}}` |
| `%s` | `{{value}}` |
| `{0}` | preserve deliberately and map consistently |
If you don’t have firm rules here, translators will see one placeholder style in .po, developers will expect another in React, and nobody will trust automated output. The docs for working with Django .po files in automation pipelines are worth reading for the operational side of that problem.
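A small validator in the pipeline catches that drift before translators or reviewers do. This sketch compares the named placeholders in a source string against its translation; the function is illustrative and not part of polib:

```python
import re

PLACEHOLDER = re.compile(r"%\((\w+)\)s")


def placeholder_drift(msgid: str, msgstr: str) -> set:
    """Return placeholder names present in one string but not the other."""
    return set(PLACEHOLDER.findall(msgid)) ^ set(PLACEHOLDER.findall(msgstr))


# A translation that drops %(name)s should fail review automatically.
print(placeholder_drift("Welcome back, %(name)s", "Willkommen zurück"))  # {'name'}
```

Run a check like this over every non-empty `msgstr` before conversion, and fail the build on a non-empty result.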
The chain you actually want
Once you stop treating React as a separate translation system, the flow gets cleaner:
```shell
python manage.py makemessages -l de
python manage.py compilemessages
python scripts/po_to_json.py
```
If your team is translating .po files with a CLI-based automation tool before conversion, that fits fine here. The important part is order. Translate the .po files first. Generate React JSON second. Commit both.
Production Readiness and CI Automation
Friday deploy. Staging looked fine in English. Then a German user opens the billing screen and gets a half-translated UI, a raw `%(name)s` placeholder in the header, and a locale reset after refresh.
That kind of failure usually comes from the pipeline, not the translator.
In a Django and React stack, production i18n breaks in predictable places. Catalogs are loaded too eagerly, locale selection lives only in React state, generated JSON is stale, or CI never verifies that .po changes made it all the way into the frontend build. Fix those paths before adding another language.

Lazy-load catalogs and persist locale
If you use react-i18next, load only the locale and namespace the current route needs. Shipping every catalog in the initial bundle gets expensive fast, especially once product copy spreads across settings, billing, onboarding, and email-related screens. Keep the default path small and fetch the rest on demand.
Persist the selected locale outside component state. A user who chooses de should still be in de after a hard refresh, a new tab, or a session restore.
```js
export function saveLocale(locale) {
  localStorage.setItem('app.locale', locale);
}

export function loadLocale() {
  return localStorage.getItem('app.locale') || 'en';
}
```
Use that value during i18n bootstrap, not after the app renders. If initialization and persistence are out of sync, you get a flash of the default language and flaky tests that only fail in CI.
Test the translated UI, not just the converter
A passing conversion script proves very little. The failure that matters is whether a real component renders the expected text with the expected placeholders.
Keep three test layers in place:
- Converter tests: `.po` input becomes the JSON shape you expect.
- Rendering tests: React components show the right translated string.
- Regression tests for placeholders: placeholder names and HTML tags survive the pipeline.
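On the converter side, a plain pytest-style function is enough. The rewrite logic is inlined here (mirroring the named-placeholder rule from the conversion script) so the test reads standalone:

```python
import re


def django_to_i18next_placeholders(text):
    """Named-placeholder rewrite, inlined from the conversion script for the test."""
    if not text:
        return text
    return re.sub(r"%\((\w+)\)s", r"{{\1}}", text)


def test_named_placeholder_survives_conversion():
    assert (
        django_to_i18next_placeholders("Willkommen zurück, %(name)s")
        == "Willkommen zurück, {{name}}"
    )


def test_empty_msgstr_passes_through():
    assert django_to_i18next_placeholders("") == ""


test_named_placeholder_survives_conversion()
test_empty_msgstr_passes_through()
```

In a real suite you would import the function from `scripts/po_to_json.py` instead of copying it, so the test fails when the script changes.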
```jsx
import { render, screen } from '@testing-library/react';
import { I18nextProvider } from 'react-i18next';
import i18n from '../i18n';
import { BillingHeader } from './BillingHeader';

test('renders translated welcome message', () => {
  render(
    <I18nextProvider i18n={i18n}>
      <BillingHeader name="Marta" />
    </I18nextProvider>
  );
  expect(screen.getByRole('heading')).toHaveTextContent('Welcome back, Marta');
});
```
I also recommend one snapshot or explicit assertion per locale for the handful of screens that mix interpolation, pluralization, and markup. Those are the screens that usually break first.
Glossaries beat clever automation
AI can help with throughput. It does not solve terminology drift.
If your .po files are the source of truth, treat product terms as versioned assets alongside them. Keep a short glossary in the repo. Review changes to words like “workspace,” “seat,” “member,” or “project” with the same care you give schema changes, because those terms leak into backend templates, React components, support docs, and screenshots. For the editorial side of that review process, combining AI and human power for accurate translations is a useful reference.
The practical rule is simple. Let automation generate, convert, and validate. Let humans decide terminology, legal copy, and anything user-facing that can create support churn if the wording changes.
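A glossary check can be as simple as asserting that agreed terms keep their agreed translations. A minimal sketch, where the glossary structure and the `GLOSSARY` name are conventions of this sketch:

```python
# Hypothetical glossary: agreed source term -> required translated term per locale.
GLOSSARY = {
    "workspace": {"de": "Arbeitsbereich"},
}


def glossary_violations(locale: str, catalog: dict) -> list:
    """Flag entries whose source mentions a term but whose translation drops it."""
    violations = []
    for source, translated in catalog.items():
        for term, per_locale in GLOSSARY.items():
            required = per_locale.get(locale)
            if required and term in source.lower() and required not in translated:
                violations.append(source)
    return violations


print(glossary_violations("de", {"Your workspace": "Dein Projekt"}))  # flags the entry
```

This is deliberately dumb string matching; the value is that terminology disagreements surface in CI output instead of in support tickets.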
A CI job your team will actually keep
The job should run from a clean checkout and fail if generated artifacts are out of date. No hidden state. No manual export step. No “someone forgot to run the script locally.”
```shell
python manage.py makemessages -l de
python manage.py makemessages -l fr
python manage.py translate
python manage.py compilemessages
python scripts/po_to_json.py
git diff --exit-code locale frontend/src/i18n/translations
```
A CI pipeline that holds up in practice usually enforces four things:
- Commit generated files: React JSON catalogs belong in version control.
- Fail on dirty diffs: CI should catch uncommitted translation output.
- Review placeholder changes carefully: they’re higher risk than plain text edits.
- Keep locale naming consistent: Django folder names and frontend locale codes need a mapping rule.
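The dirty-diff check can also live in a small script instead of shell one-liners, which makes the failure message explicit and testable. A sketch, assuming the converter writes files with the same `json.dumps` settings shown earlier; the function name and signature are this sketch's own:

```python
import json
from pathlib import Path


def catalogs_in_sync(generated: dict, output_dir: Path) -> bool:
    """Compare freshly generated catalogs against the committed JSON files."""
    for locale, data in generated.items():
        path = output_dir / f"{locale}.json"
        expected = json.dumps(data, ensure_ascii=False, indent=2) + "\n"
        if not path.exists() or path.read_text(encoding="utf-8") != expected:
            return False
    return True
```

CI regenerates the catalogs in memory, calls this against the checked-out files, and fails the job on `False`, which gives the same guarantee as `git diff --exit-code` without depending on repository state.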
For teams wiring that into automation, the CI workflow for translation runs in Django projects is a good reference for job structure and failure conditions.
The standard I want is boring: checkout, generate, compile, convert, diff, fail on mismatch. If CI can reproduce the whole Django to React chain every time, deploy day stops being the moment you find out your frontend and backend disagree about language.
Your i18n Workflow Before the Next Deploy
If your Django backend and React frontend still run separate translation systems, fix that before you add another locale.
Keep one source of truth. Keep one review path. Generate what React needs from what Django already owns.
The deploy checklist that holds up
Run this sequence and commit the outputs:
```shell
python manage.py makemessages -l de
python manage.py makemessages -l fr
python manage.py compilemessages
python scripts/po_to_json.py
```
If translations are added between extraction and conversion, put that step in the middle and keep it inside the same scripted flow.
The rules worth keeping
- Django owns the canonical strings: `.po` files come first.
- React consumes generated JSON: never hand-maintain both.
- Keys and placeholders need conventions: write them down once.
- CI should reproduce the whole chain: extraction, translation, compilation, conversion, diff.
One more thing. Don’t wait for “full localization” to clean this up. The cost of a bad workflow shows up before the cost of extra languages does. It shows up in review time, broken placeholders, and engineers avoiding string changes because they know translation fallout is coming.
If you fix the pipeline, adding another locale becomes routine work instead of a release risk.
If you want a CLI-first way to translate Django .po files without copy-paste or vendor portals, TranslateBot is built for that workflow. It runs from manage.py, preserves placeholders and HTML, writes reviewable diffs back to your locale files, and fits cleanly before the .po to React JSON conversion step described above.