JSON to CSV Converter
Input JSON
Output CSV
Instant Conversion
Paste JSON, click Convert — your CSV is ready in under a second with all fields mapped correctly
Data Preview Table
See your data in a readable table before downloading so you can verify the output looks right
Download as File
Download the CSV directly — open straight in Excel, Google Sheets, or any data tool
Flexible Options
Choose your delimiter, control quoting, include or skip the header row, validate before converting
JSON to CSV Converter Online — Export JSON Arrays to Spreadsheets Instantly in 2026
If you work with APIs or databases, you've almost certainly had this situation: you have JSON data that you need in a spreadsheet. Maybe it's a list of users from an API response. Maybe it's a MongoDB export you need to analyze in Excel. Maybe it's a data dump from a web scraper that you want to pivot in Google Sheets. Whatever the source, the problem is the same — JSON is a developer format, and spreadsheets are what most people actually use for data work. This JSON to CSV converter online bridges that gap in about three seconds.
Paste your JSON array in the left panel, click Convert, and the CSV appears on the right. You can preview it in a formatted table, copy it to clipboard, or download it as a .csv file ready to open in any spreadsheet application. Everything runs in your browser — no data is uploaded anywhere.
Input must be a JSON array: This tool converts arrays of objects (the most common format for data). A single JSON object gets automatically wrapped in an array. If your JSON is deeply nested or isn't an array at the top level, you'll need to extract the array you want to convert first.
What the Converter Does — Step by Step
Understanding what happens during conversion helps you predict the output and make better choices about the options:
1. Parse the JSON. The tool reads your JSON and validates it. If there's a syntax error, you get an error message instead of broken output.
2. Collect all unique keys. The converter scans every object in the array and builds a complete list of all field names that appear anywhere. This matters for inconsistent data — if most records have 8 fields but a few have a 9th field, that 9th field still gets a column, and records without it get an empty cell.
3. Write the header row (optional). The collected field names become the CSV column headers. You can turn this off if your destination system doesn't want a header row.
4. Write one row per JSON object. Each object in the array becomes one CSV row. Values are mapped to the correct column by key name.
5. Handle non-string values. Arrays and nested objects within a field get serialized as JSON strings in that cell (e.g., ["JavaScript","React"] becomes a quoted string in the CSV). Booleans become true or false. Null values become empty cells. Numbers are written as-is.
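The five steps above can be sketched in plain JavaScript. This is a simplified illustration of the process described, not the tool's actual source; the names `jsonToCsv` and `escapeField` are hypothetical.

```javascript
// Minimal sketch of the conversion pipeline (illustrative names).
function jsonToCsv(json, delimiter = ",", includeHeader = true) {
  let data = JSON.parse(json);               // step 1: parse and validate
  if (!Array.isArray(data)) data = [data];   // single object → wrap in array

  // Step 2: collect every key that appears in any record.
  const keys = [...new Set(data.flatMap(Object.keys))];

  const escapeField = (value) => {
    if (value === null || value === undefined) return "";        // null/missing → empty cell
    const s = typeof value === "object" ? JSON.stringify(value)  // arrays/objects → JSON string
                                        : String(value);         // numbers/booleans as-is
    // Quote only if the value contains the delimiter, quotes, or newlines.
    return /["\n\r]/.test(s) || s.includes(delimiter)
      ? `"${s.replace(/"/g, '""')}"`
      : s;
  };

  // Steps 4-5: one CSV row per object, values mapped by key name.
  const rows = data.map(rec => keys.map(k => escapeField(rec[k])).join(delimiter));
  if (includeHeader) rows.unshift(keys.map(k => escapeField(k)).join(delimiter)); // step 3
  return rows.join("\r\n");
}
```

Note how a record that lacks a collected key simply produces an empty cell, which is the behavior described in step 2.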
Understanding the Delimiter Options
The delimiter is the character that separates values in each row. The choice matters depending on where you're taking the CSV:
Comma ( , ) is the standard for most English-language systems. Excel on Windows, Google Sheets, and most data tools default to comma-delimited CSV. Use this unless you have a specific reason not to.
Semicolon ( ; ) is the default in Excel for many European regions (Germany, France, Netherlands, etc.) because those locales use commas as decimal separators. If you're working in a European locale and your numbers show up wrong or columns aren't splitting correctly in Excel, try semicolon instead of comma.
Tab ( \t ) produces TSV (Tab-Separated Values) format. This is useful when your data contains commas or semicolons within field values and you don't want quoting complexity. Tab characters rarely appear in data values, so tab-delimited files are often cleaner for text-heavy data. Some database import tools also prefer TSV.
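To see why the delimiter choice interacts with quoting, here is the same record rendered with each option. `escapeRow` is a hypothetical helper for illustration, not this tool's API:

```javascript
// Illustrative: the same values rendered with each delimiter option.
function escapeRow(values, delimiter) {
  return values.map(v => {
    const s = String(v);
    // A field needs quoting only if it contains the chosen delimiter,
    // a quote character, or a newline.
    return /["\n\r]/.test(s) || s.includes(delimiter)
      ? `"${s.replace(/"/g, '""')}"`
      : s;
  }).join(delimiter);
}

const row = ["Müller, GmbH", "3,14", "Berlin"];
console.log(escapeRow(row, ","));   // comma: fields containing commas get quoted
console.log(escapeRow(row, ";"));   // semicolon: the embedded commas need no quoting
console.log(escapeRow(row, "\t"));  // TSV: no quoting needed for comma-heavy data
```

With the comma delimiter, the first two fields are quoted; with semicolon or tab, the same data needs no quoting at all, which is exactly why those options suit comma-heavy values.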
How Inconsistent JSON Structures Are Handled
Real-world JSON from APIs and databases is often inconsistent. Some records have fields that others don't. This is especially common with NoSQL databases like MongoDB or Firebase, where there's no enforced schema and different documents can have different shapes.
This tool handles it by scanning all records first to build the complete set of columns, then filling empty cells for any record that doesn't have a particular field. Here's an example of what that looks like:
| JSON Record | Fields Present | CSV Output (5 columns total) |
|---|---|---|
| Record 1 | id, name, email, phone | address cell left empty |
| Record 2 | id, name, email | phone and address cells left empty |
| Record 3 | id, name, email, phone, address | all 5 columns filled |
This means the output always captures every field that exists anywhere in your data, with empty cells where individual records don't have that field. No data is silently dropped.
Nested Objects and Arrays in JSON — What to Expect
JSON supports nested objects and arrays, which don't have a direct equivalent in CSV's flat two-dimensional structure. The tool handles these by serializing them as strings within the cell:
- Arrays become JSON-encoded strings. A skills field like ["JavaScript", "React", "Node.js"] becomes "[""JavaScript"",""React"",""Node.js""]" in the CSV — the internal quotes doubled because they're inside a quoted CSV field.
- Nested objects become JSON-encoded strings. An address field like {"city": "London", "country": "UK"} becomes a single serialized string in one cell.
- Null values become empty cells. A field with null is treated the same as a missing field.
- Booleans become true/false strings. true and false are written as their string equivalents.
If you need the nested object's fields as separate columns — for example, you want address.city and address.country as their own columns rather than one combined cell — you'd need to flatten the JSON before converting. You can do this programmatically in JavaScript with a simple recursive flatten function, or use a tool that specifically supports dot-notation flattening.
Quick JavaScript flatten for pre-processing: If you need to flatten nested objects before converting, you can run a small recursive function in your browser console that prefixes each nested key with its parent key name (so address becomes address.city, address.country, and so on), then JSON.stringify the flattened array and paste the result into this tool.
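A minimal version of such a flatten function might look like the following. This is a sketch to run in the console before pasting into the tool; the name `flattenRecord` is illustrative:

```javascript
// Recursively flatten nested objects, joining keys with dots
// (e.g. {address: {city: "London"}} → {"address.city": "London"}).
function flattenRecord(obj, prefix = "") {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    const name = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      Object.assign(out, flattenRecord(value, name)); // recurse into nested objects
    } else {
      out[name] = value; // primitives and arrays stay as single values
    }
  }
  return out;
}

// Usage: flatten every record, then copy the JSON string into the tool.
const data = [{ id: 1, address: { city: "London", country: "UK" } }];
const flat = data.map(r => flattenRecord(r));
// JSON.stringify(flat) → paste the output into the converter
```

Arrays are deliberately left intact here, since they usually belong in one cell; only nested objects are split into dot-notation columns.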
Using the Data Preview Table
The preview table shows the first 10 rows of your converted data in a readable grid. This is useful for a quick sanity check before downloading — you can verify the columns are named correctly, the values look right, and the structure matches what you're expecting to import.
A few things worth checking in the preview before downloading:
- Column count and names — are all the fields you expected present? Any unexpected columns from fields you didn't realize were in your data?
- Array fields — if a column contains serialized JSON arrays, decide whether that's acceptable or whether you need to pre-process the data
- Empty cells — gaps in the preview tell you which records were missing which fields
- Number formatting — large numbers and decimals should display as they appeared in the JSON; note that integers beyond JavaScript's safe range (above roughly 9 quadrillion) may lose precision during parsing
- Unicode and special characters — if your data contains non-ASCII characters, verify they look right in the preview
Practical Use Cases — When You'd Reach for This Tool
The situations where a json to csv converter for data analysis is most useful tend to be the same ones over and over:
- API response to spreadsheet — you called an API, got back a JSON array of records, and a stakeholder or business analyst needs to work with that data in Excel or Google Sheets. Paste the response body, convert, download.
- MongoDB export — you ran mongoexport and got a JSON file. You want to do some quick analysis in a spreadsheet or import into a relational database that expects CSV.
- Postman or Insomnia data — you've been testing API endpoints and have response data saved. Converting to CSV lets you compare sets of responses or analyze the data in bulk.
- Firebase Realtime Database export — Firebase exports as JSON. If you need to analyze usage data, user records, or content in a spreadsheet, this tool handles the conversion.
- Log data analysis — structured logs that your application writes as JSON objects can be converted to CSV for analysis in tools that work better with tabular data.
- Sharing data with non-developers — someone who isn't comfortable with JSON needs to see or work with the data. CSV is universally understood.
Frequently Asked Questions
My JSON isn't an array — it's a single object. Will this work?
Yes. The tool detects when you've pasted a single JSON object (rather than an array) and automatically wraps it in an array before converting. The result will be a CSV with one data row (plus the header). If your JSON is a wrapper object that contains an array inside a field — like {"results": [...]} from a paginated API — you'll need to extract the inner array first. Copy just the array value (the part between the outer brackets), paste that into the tool, and it will convert correctly.
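If copying the inner array by hand is fiddly, you can extract it in the browser console first. This is an illustrative snippet; the field name results is an example, so substitute whatever key your API uses:

```javascript
// Pull the inner array out of a paginated API wrapper, then
// paste the stringified result into the converter.
const response = '{"page": 1, "results": [{"id": 1}, {"id": 2}]}';
const inner = JSON.parse(response).results;
console.log(JSON.stringify(inner)); // → [{"id":1},{"id":2}]
```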
Why does my CSV not open correctly in Excel?
The most common cause is a delimiter mismatch. If you're in a European locale, Excel may expect semicolons instead of commas. Try the semicolon option in this tool and re-download. Another common cause is Excel misidentifying the file encoding — if you see garbled characters instead of special characters, your data may have Unicode content that needs UTF-8 encoding. Excel handles this better when you import via "Get Data" or "From Text/CSV" rather than double-clicking the file. Google Sheets generally handles Unicode CSV more reliably than Excel.
What happens to JSON arrays inside the data?
Arrays within field values (like a list of tags or skills) become serialized JSON strings inside a single CSV cell. The array ["a", "b", "c"] becomes the string ["a","b","c"] in the CSV. If you need these as separate columns or rows, you'd need to pre-process the JSON before converting — either by exploding the array (creating one row per array element) or extracting specific array indices as separate fields. That kind of transformation is beyond what a simple JSON-to-CSV tool handles.
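If you do want one row per array element, the "explode" pre-processing step can be done in a few lines before converting. A sketch, with the hypothetical helper name `explodeField`:

```javascript
// Turn one record with an array field into one record per element.
function explodeField(records, field) {
  return records.flatMap(rec =>
    Array.isArray(rec[field]) && rec[field].length > 0
      ? rec[field].map(item => ({ ...rec, [field]: item })) // one row per element
      : [rec]                                               // pass other records through
  );
}

const users = [
  { id: 1, skills: ["JS", "React"] },
  { id: 2, skills: [] },
];
const exploded = explodeField(users, "skills");
// → one row with skills "JS", one with "React", and user 2 unchanged
```

Run this on your data, JSON.stringify the result, and paste that into the tool instead of the original.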
Is there a size limit on the JSON I can convert?
There's no hard limit enforced by the tool — everything runs in your browser and the limit is effectively your browser's available memory. In practice, files up to a few megabytes convert quickly and smoothly. Very large datasets (tens of thousands of rows, each with many fields) will still work but may take a few seconds and use more memory. If you're converting extremely large JSON exports, consider breaking them into smaller chunks or using a command-line tool like jq which is purpose-built for processing large JSON files efficiently.
Does the download include a BOM (Byte Order Mark) for Excel compatibility?
The downloaded file uses UTF-8 encoding without BOM. Most modern spreadsheet applications handle this correctly. If you're working with legacy Excel versions on Windows that require a BOM to recognize UTF-8, you may need to open the file via "Get Data" → "From Text/CSV" and specify the encoding manually, rather than double-clicking to open. Google Sheets, LibreOffice Calc, and modern Excel versions handle UTF-8 without BOM without issues.
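If you must produce a BOM-prefixed file for a legacy Excel setup, you can prepend the BOM yourself before saving. A sketch under that assumption:

```javascript
// Prepend a UTF-8 BOM so legacy Excel recognizes the encoding.
const csv = "name,city\nMüller,Köln";
const withBom = "\uFEFF" + csv;

// In a browser you could then build a download from it, e.g.:
// new Blob([withBom], { type: "text/csv;charset=utf-8" })
```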
Is my data safe when I use this tool?
Yes. All conversion processing runs locally in your browser using JavaScript. Nothing you paste is transmitted to any server. Your JSON — whether it contains customer data, API responses with credentials, or any other sensitive information — never leaves your computer. Once you close the tab, nothing is retained. There's no account, no history, no external network requests involving your data.
Can I convert the CSV back to JSON?
This tool is specifically for JSON to CSV. For the reverse conversion, check our CSV to JSON converter tool on the platform. Keep in mind that the reverse conversion loses some information — nested objects and arrays that were flattened into string cells don't automatically reconstruct into their original hierarchical form. Round-trip fidelity depends on how complex your original JSON structure was.
Tips for Getting Clean CSV Output
A few practices that produce better results when working with this tool:
- Flatten nested objects before converting if you need them as columns. The tool serializes nested objects as strings. If you want address.city as its own column, pre-process the data in JavaScript or Python first.
- Remove fields you don't need before converting. If your JSON has 30 fields but you only need 5, strip the rest first. This produces a cleaner CSV and is easier to work with in a spreadsheet.
- Use the data preview to verify before downloading. Two seconds of checking the preview saves the frustration of opening a broken file in Excel.
- For European locales, use semicolons. If you or the recipient uses Excel in a European locale, the semicolon delimiter will produce a file that opens without the column merging that happens when Excel expects semicolons but gets commas.
- Use TSV for data that contains commas. If your data values frequently contain commas (like addresses or free-text fields), tab-delimited output avoids the quoting complexity that can confuse some import tools.
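For the second tip — stripping fields you don't need — a one-liner in the console does the job. `pickFields` is a hypothetical helper name:

```javascript
// Keep only the listed fields from each record before converting.
function pickFields(records, fields) {
  return records.map(rec =>
    Object.fromEntries(fields.map(f => [f, rec[f]])) // absent fields become undefined
  );
}

const rows = [{ id: 1, name: "Ann", internalFlag: true, debugInfo: "x" }];
const trimmed = pickFields(rows, ["id", "name"]);
// → [{ id: 1, name: "Ann" }] — ready to stringify and paste into the tool
```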