Compare commits

..

3 Commits

Author SHA1 Message Date
Ignacio Ballesteros
ec1426241c Ignore generated content/ output and erl_crash.dump 2026-02-19 18:21:02 +01:00
Ignacio Ballesteros
692f23bc36 Add org-roam workflow: ox-hugo export pipeline and example notes
- Add flake.nix dev shell with Node.js 22, Elixir, and Emacs+ox-hugo
- Add scripts/export.exs: exports org-roam notes to content/ via ox-hugo,
  mirroring subdirectory structure and generating a fallback index.md
- Add npm scripts: export, build:notes, serve:notes
- Configure FrontMatter plugin for TOML (ox-hugo default output)
- Replace ObsidianFlavoredMarkdown with OxHugoFlavouredMarkdown
- Add example notes: Madrid public transport (metro, bus, roads)
- Update README and AGENTS.md with org-roam workflow instructions
2026-02-19 18:20:43 +01:00
Ignacio Ballesteros
86eaaae945 Add branch workflow documentation to AGENTS.md 2026-02-18 12:44:49 +01:00
60 changed files with 713 additions and 7301 deletions

4
.gitignore vendored
View File

@@ -9,3 +9,7 @@ tsconfig.tsbuildinfo
private/
.replit
replit.nix
erl_crash.dump
# content/ is generated by the export script; only keep the placeholder
content/*
!content/.gitkeep

282
AGENTS.md Normal file
View File

@@ -0,0 +1,282 @@
# AGENTS.md - Coding Agent Instructions
This document provides essential information for AI coding agents working in this repository.
## Project Overview
**Quartz** is a static site generator for publishing digital gardens and notes as websites.
Built with TypeScript, Preact, and unified/remark/rehype for markdown processing.
| Stack | Technology |
| ------------- | ----------------------------------------- |
| Language | TypeScript 5.x (strict mode) |
| Runtime | Node.js >=22 (v22.16.0 pinned) |
| Package Mgr | npm >=10.9.2 |
| Module System | ES Modules (`"type": "module"`) |
| UI Framework | Preact 10.x (JSX with `react-jsx` pragma) |
| Build Tool | esbuild |
| Styling | SCSS via esbuild-sass-plugin |
## Environment
This is a Nix project. Use the provided `flake.nix` to enter a dev shell with Node.js 22 and npm:
```bash
nix develop
```
All `npm` commands below must be run inside the dev shell.
## Build, Lint, and Test Commands
```bash
# Type check and format check (CI validation)
npm run check
# Auto-format code with Prettier
npm run format
# Run all tests
npm run test
# Run a single test file
npx tsx --test quartz/util/path.test.ts
# Run tests matching a pattern (use --test-name-pattern)
npx tsx --test --test-name-pattern="typeguards" quartz/util/path.test.ts
# Build the static site
npx quartz build
# Build and serve with hot reload
npx quartz build --serve
# Profile build performance
npm run profile
```
### Test Files Location
Tests use Node.js native test runner via `tsx`. Test files follow the `*.test.ts` pattern:
- `quartz/util/path.test.ts`
- `quartz/util/fileTrie.test.ts`
- `quartz/components/scripts/search.test.ts`
## Code Style Guidelines
### Prettier Configuration (`.prettierrc`)
```json
{
"printWidth": 100,
"tabWidth": 2,
"semi": false,
"trailingComma": "all",
"quoteProps": "as-needed"
}
```
**No ESLint** - only Prettier for formatting. Run `npm run format` before committing.
### TypeScript Configuration
- **Strict mode enabled** (`strict: true`)
- `noUnusedLocals: true` - no unused variables
- `noUnusedParameters: true` - no unused function parameters
- JSX configured for Preact (`jsxImportSource: "preact"`)
### Import Conventions
```typescript
// 1. External packages first
import { PluggableList } from "unified"
import { visit } from "unist-util-visit"
// 2. Internal utilities/types (relative paths)
import { QuartzTransformerPlugin } from "../types"
import { FilePath, slugifyFilePath } from "../../util/path"
import { i18n } from "../../i18n"
```
### Naming Conventions
| Element | Convention | Example |
| ---------------- | ------------ | ----------------------------------- |
| Files (utils) | camelCase | `path.ts`, `fileTrie.ts` |
| Files (comps) | PascalCase | `TableOfContents.tsx`, `Search.tsx` |
| Types/Interfaces | PascalCase | `QuartzComponent`, `FullSlug` |
| Type Guards | `is*` prefix | `isFilePath()`, `isFullSlug()` |
| Constants | UPPER_CASE | `QUARTZ`, `UPSTREAM_NAME` |
| Options types | `Options` | `interface Options { ... }` |
### Branded Types Pattern
This codebase uses branded types for type-safe path handling:
```typescript
type SlugLike<T> = string & { __brand: T }
export type FilePath = SlugLike<"filepath">
export type FullSlug = SlugLike<"full">
export type SimpleSlug = SlugLike<"simple">
// Always validate with type guards before using
export function isFilePath(s: string): s is FilePath { ... }
```
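A self-contained sketch of how the branded types are used in practice. The guard body below is illustrative only — the real `isFilePath` in `quartz/util/path.ts` has its own validation logic:

```typescript
// Branded types: plain strings cannot be passed where a FilePath is expected.
type SlugLike<T> = string & { __brand: T }
type FilePath = SlugLike<"filepath">

// Illustrative guard (NOT the real implementation): has an extension, is not a URL.
function isFilePath(s: string): s is FilePath {
  return /\.[A-Za-z0-9]+$/.test(s) && !s.includes("://")
}

function describeFile(fp: FilePath): string {
  return `file: ${fp}`
}

const input = "notes/metro-madrid.org"
if (isFilePath(input)) {
  // inside this branch, `input` is narrowed to FilePath
  describeFile(input) // OK
}
// calling describeFile(input) outside the guard would be a compile error
```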
### Component Pattern (Preact)
Components use a factory function pattern with attached static properties:
```typescript
export default ((userOpts?: Partial<Options>) => {
const opts: Options = { ...defaultOptions, ...userOpts }
const ComponentName: QuartzComponent = ({ cfg, displayClass }: QuartzComponentProps) => {
return <div class={classNames(displayClass, "component-name")}>...</div>
}
ComponentName.css = style // SCSS styles
ComponentName.afterDOMLoaded = script // Client-side JS
return ComponentName
}) satisfies QuartzComponentConstructor
```
### Plugin Pattern
Three plugin types: transformers, filters, and emitters.
```typescript
export const PluginName: QuartzTransformerPlugin<Partial<Options>> = (userOpts) => {
const opts = { ...defaultOptions, ...userOpts }
return {
name: "PluginName",
markdownPlugins(ctx) { return [...] },
htmlPlugins(ctx) { return [...] },
externalResources(ctx) { return { js: [], css: [] } },
}
}
```
### Testing Pattern
Use Node.js native test runner with `assert`:
```typescript
import test, { describe, beforeEach } from "node:test"
import assert from "node:assert"
describe("FeatureName", () => {
test("should do something", () => {
assert.strictEqual(actual, expected)
assert.deepStrictEqual(actualObj, expectedObj)
assert(condition) // truthy assertion
assert(!condition) // falsy assertion
})
})
```
### Error Handling
- Use `try/catch` for critical operations (file I/O, parsing)
- Custom `trace` utility for error reporting with stack traces
- `process.exit(1)` for fatal errors
- `console.warn()` for non-fatal issues
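A minimal sketch of these conventions (function names are illustrative; the real `trace` utility lives in `quartz/util/trace.ts` and is not reproduced here):

```typescript
// Critical operation (parsing) wrapped in try/catch; non-fatal failures
// warn and fall back, fatal failures exit non-zero.
function parseConfig(raw: string): Record<string, unknown> {
  try {
    return JSON.parse(raw)
  } catch (err) {
    // non-fatal: warn and fall back to an empty config
    console.warn(`invalid config, using defaults: ${(err as Error).message}`)
    return {}
  }
}

function fatal(message: string): never {
  // fatal path: report, then exit non-zero so the build fails loudly
  console.error(message)
  process.exit(1)
}
```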
### Async Patterns
- Prefer `async/await` over raw promises
- Use async generators (`async *emit()`) for streaming file output
- Use `async-mutex` for concurrent build protection
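A minimal sketch combining these patterns — an emitter-style async generator drained under a mutex. The `Mutex` class here is a tiny stand-in so the sketch stays self-contained; the real code uses the `async-mutex` package:

```typescript
// Stand-in for async-mutex's Mutex: chains work onto a promise tail so
// only one runExclusive callback runs at a time.
class Mutex {
  private tail: Promise<void> = Promise.resolve()
  runExclusive<T>(fn: () => Promise<T>): Promise<T> {
    const run = this.tail.then(fn)
    this.tail = run.then(
      () => undefined,
      () => undefined,
    )
    return run
  }
}

// Emitter-style async generator: yields each output path as it is "written".
async function* emit(files: string[]): AsyncGenerator<string> {
  for (const fp of files) {
    // the real emitter would write the file to disk here
    yield fp.replace(/\.md$/, ".html")
  }
}

const buildMutex = new Mutex()

// Serialize concurrent builds, collecting everything the generator emits.
async function build(files: string[]): Promise<string[]> {
  return buildMutex.runExclusive(async () => {
    const out: string[] = []
    for await (const fp of emit(files)) out.push(fp)
    return out
  })
}
```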
## Project Structure
```
quartz/
├── bootstrap-cli.mjs # CLI entry point
├── build.ts # Build orchestration
├── cfg.ts # Configuration types
├── components/ # Preact UI components
│ ├── *.tsx # Components
│ ├── scripts/ # Client-side scripts (*.inline.ts)
│ └── styles/ # Component SCSS
├── plugins/
│ ├── transformers/ # Markdown AST transformers
│ ├── filters/ # Content filters
│ ├── emitters/ # Output generators
│ └── types.ts # Plugin type definitions
├── processors/ # Build pipeline (parse/filter/emit)
├── util/ # Utility functions
└── i18n/ # Internationalization (30+ locales)
```
## Branch Workflow
This is a fork of [jackyzha0/quartz](https://github.com/jackyzha0/quartz) with org-roam customizations.
| Branch | Purpose |
| ----------- | ------------------------------------------------ |
| `main` | Clean mirror of upstream quartz — no custom code |
| `org-roam` | Default branch — all customizations live here |
| `feature/*` | Short-lived branches off `org-roam` |
### Pulling Upstream Updates
```bash
git checkout main
git fetch upstream
git merge upstream/main
git checkout org-roam
git merge main
# Resolve conflicts if any, then commit
```
### Working on Features
```bash
git checkout org-roam
git checkout -b feature/my-feature
# ... work ...
git checkout org-roam
git merge feature/my-feature
git branch -d feature/my-feature
```
**Merge direction:** `upstream → main → org-roam → feature/*`
## Org-Roam Workflow
Notes live in a **separate directory** outside this repo. The export script
converts them to Markdown via ox-hugo, then Quartz builds the site.
### Tooling
The dev shell (`nix develop`) provides:
- `nodejs_22` — Quartz build
- `elixir` — runs the export script
- `emacs` + `ox-hugo` — performs the org → markdown conversion
### Export and build
```bash
# Export only (wipes content/, exports all .org files)
NOTES_DIR=/path/to/notes npm run export
# Export then build the site
NOTES_DIR=/path/to/notes npm run build:notes
# Positional arg also works
elixir scripts/export.exs /path/to/notes
```
The export script (`scripts/export.exs`) runs Emacs in batch mode, calling
`org-hugo-export-to-md` per file (per-file mode, not per-subtree). It uses
TOML frontmatter (the ox-hugo default). `content/` is wiped before each export.
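For reference, ox-hugo's default export produces TOML front-matter delimited by `+++`; the exact fields depend on your org properties, so the sample below is illustrative:

```toml
+++
title = "Metro de Madrid"
date = 2026-02-19
draft = false
+++
```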
## Important Notes
- **Client-side scripts**: Use `.inline.ts` suffix, bundled via esbuild
- **Isomorphic code**: `quartz/util/path.ts` must not use Node.js APIs
- **Incremental builds**: Plugins can implement `partialEmit` for efficiency
- **Markdown flavors**: Supports Obsidian (`ofm.ts`) and Roam (`roam.ts`) syntax

View File

@@ -1,13 +1,71 @@
# Quartz v4
# Quartz v4 — org-roam edition
> [One] who works with the door open gets all kinds of interruptions, but [they] also occasionally gets clues as to what the world is and what might be important. — Richard Hamming
> "[One] who works with the door open gets all kinds of interruptions, but [they] also occasionally gets clues as to what the world is and what might be important." — Richard Hamming
Quartz is a set of tools that helps you publish your [digital garden](https://jzhao.xyz/posts/networked-thought) and notes as a website for free.
🔗 Read the documentation and get started: https://quartz.jzhao.xyz/
This fork adds first-class support for [org-roam](https://www.orgroam.com/) notes via [ox-hugo](https://ox-hugo.scripter.co/).
🔗 Upstream documentation: https://quartz.jzhao.xyz/
[Join the Discord Community](https://discord.gg/cRFFHYye7t)
## Quick Start
### Prerequisites
This project uses Nix. Enter the development shell, which provides Node.js 22, Elixir, and Emacs with ox-hugo:
```bash
nix develop
```
All commands below must be run inside this shell.
```bash
npm install
```
### Building from org-roam notes
Your org-roam notes live in a separate directory. Point `NOTES_DIR` at it:
```bash
# Export notes to content/ and build the site
NOTES_DIR=/path/to/notes npm run build:notes
# Export, build, and serve with hot reload
NOTES_DIR=/path/to/notes npm run serve:notes
# Export only (wipes content/ and re-exports all .org files)
NOTES_DIR=/path/to/notes npm run export
```
The export script mirrors the subdirectory structure of your notes into `content/`
and generates a fallback `index.md` if your notes don't include one.
### Building without org-roam notes
If you manage `content/` directly with Markdown files:
```bash
# Build the site
npx quartz build
# Build and serve with hot reload
npx quartz build --serve
```
The site is generated in `public/`. When serving, visit http://localhost:8080.
### Development
```bash
npm run check # type check + format check
npm run format # auto-format with Prettier
npm run test # run tests
```
## Sponsors
<p align="center">

View File

@@ -10,10 +10,8 @@ By default, Quartz ships with the [[ObsidianFlavoredMarkdown]] plugin, which is
It also ships with support for [frontmatter parsing](https://help.obsidian.md/Editing+and+formatting/Properties) with the same fields that Obsidian uses through the [[Frontmatter]] transformer plugin.
Quartz also provides [[CrawlLinks]] plugin, which allows you to customize Quartz's link resolution behaviour to match Obsidian.
For dynamic database-like views, Quartz supports [[bases|Obsidian Bases]] through the [[ObsidianBases]] transformer and [[BasePage]] emitter plugins.
Finally, Quartz also provides [[CrawlLinks]] plugin, which allows you to customize Quartz's link resolution behaviour to match Obsidian.
## Configuration
This functionality is provided by the [[ObsidianFlavoredMarkdown]], [[ObsidianBases]], [[Frontmatter]] and [[CrawlLinks]] plugins. See the plugin pages for customization options.
This functionality is provided by the [[ObsidianFlavoredMarkdown]], [[Frontmatter]] and [[CrawlLinks]] plugins. See the plugin pages for customization options.

View File

@@ -1,42 +0,0 @@
---
title: Bases
tags:
- feature/transformer
- feature/emitter
---
Quartz supports [Obsidian Bases](https://help.obsidian.md/bases), which allow you to create dynamic, database-like views of your notes. See the [official Obsidian documentation](https://help.obsidian.md/bases/syntax) for the full syntax reference.
## Quick Example
Create a `.base` file in your content folder:
```yaml
filters:
and:
- file.hasTag("task")
views:
- type: table
name: "Task List"
order:
- file.name
- status
- due_date
```
Each view gets its own page at `<base-name>/<view-name>`.
## Wikilinks
Link to base views using the standard [[navigation.base#Plugins|wikilink]] syntax:
```markdown
[[my-base.base#Task List]]
```
This resolves to `my-base/Task-List`.
## Configuration
This functionality is provided by the [[ObsidianBases]] transformer plugin (which parses `.base` files) and the [[BasePage]] emitter plugin (which generates the pages).

View File

@@ -1,93 +0,0 @@
filters:
and:
- file.ext == "md"
formulas:
doc_type: |
if(file.hasTag("plugin/transformer"), "transformer",
if(file.hasTag("plugin/emitter"), "emitter",
if(file.hasTag("plugin/filter"), "filter",
if(file.hasTag("component"), "component",
if(file.inFolder("features"), "feature",
if(file.inFolder("advanced"), "advanced",
if(file.inFolder("plugins"), "plugin", "guide")))))))
last_modified: file.mtime.relative()
section: |
if(file.inFolder("plugins"), "plugins",
if(file.inFolder("features"), "features",
if(file.inFolder("advanced"), "advanced",
if(file.inFolder("tags"), "tags", "core"))))
properties:
title:
displayName: Title
formula.doc_type:
displayName: Type
formula.last_modified:
displayName: Updated
formula.section:
displayName: Section
views:
- type: table
name: All Documentation
groupBy:
property: formula.section
direction: ASC
order:
- file.name
- title
- formula.doc_type
- formula.section
- formula.last_modified
sort:
- property: formula.doc_type
direction: ASC
- property: file.name
direction: ASC
columnSize:
file.name: 185
note.title: 268
formula.doc_type: 146
formula.section: 276
- type: table
name: Plugins
filters:
or:
- file.hasTag("plugin/transformer")
- file.hasTag("plugin/emitter")
- file.hasTag("plugin/filter")
groupBy:
property: formula.doc_type
direction: ASC
order:
- file.name
- title
- formula.doc_type
- formula.last_modified
- type: table
name: Components & Features
filters:
or:
- file.hasTag("component")
- file.inFolder("features")
order:
- file.name
- title
- formula.doc_type
- formula.last_modified
- type: list
name: Recently Updated
order:
- file.name
- formula.last_modified
limit: 15
- type: table
name: Core Guides
filters:
not:
- file.inFolder("plugins")
- file.inFolder("features")
- file.inFolder("advanced")
- file.inFolder("tags")
order:
- file.name
- title
- formula.last_modified

View File

@@ -1,18 +0,0 @@
---
title: BasePage
tags:
- plugin/emitter
---
This plugin emits pages for each view defined in `.base` files. See [[bases]] for usage.
> [!note]
> For information on how to add, remove or configure plugins, see the [[configuration#Plugins|Configuration]] page.
Pages use `defaultListPageLayout` from `quartz.layout.ts` with `BaseContent` as the page body. To customize the layout, edit `quartz/components/pages/BaseContent.tsx`.
## API
- Category: Emitter
- Function name: `Plugin.BasePage()`.
- Source: [`quartz/plugins/emitters/basePage.tsx`](https://github.com/jackyzha0/quartz/blob/v4/quartz/plugins/emitters/basePage.tsx).

View File

@@ -1,20 +0,0 @@
---
title: ObsidianBases
tags:
- plugin/transformer
---
This plugin parses `.base` files and compiles them for rendering. See [[bases]] for usage.
> [!note]
> For information on how to add, remove or configure plugins, see the [[configuration#Plugins|Configuration]] page.
## Configuration
- `emitWarnings`: If `true` (default), emits parse errors and type mismatches as warnings during build.
## API
- Category: Transformer
- Function name: `Plugin.ObsidianBases()`.
- Source: [`quartz/plugins/transformers/bases.ts`](https://github.com/jackyzha0/quartz/blob/v4/quartz/plugins/transformers/bases.ts).

61
flake.lock generated Normal file
View File

@@ -0,0 +1,61 @@
{
"nodes": {
"flake-utils": {
"inputs": {
"systems": "systems"
},
"locked": {
"lastModified": 1731533236,
"narHash": "sha256-l0KFg5HjrsfsO/JpG+r7fRrqm12kzFHyUHqHCVpMMbI=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "11707dc2f618dd54ca8739b309ec4fc024de578b",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1771008912,
"narHash": "sha256-gf2AmWVTs8lEq7z/3ZAsgnZDhWIckkb+ZnAo5RzSxJg=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "a82ccc39b39b621151d6732718e3e250109076fa",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},
"root": "root",
"version": 7
}

30
flake.nix Normal file
View File

@@ -0,0 +1,30 @@
{
description = "Quartz org-roam dev shell";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }:
flake-utils.lib.eachDefaultSystem (system:
let
pkgs = import nixpkgs { inherit system; };
in
{
devShells.default = pkgs.mkShell {
buildInputs = [
pkgs.nodejs_22
pkgs.elixir
((pkgs.emacsPackagesFor pkgs.emacs-nox).emacsWithPackages (epkgs: [ epkgs.ox-hugo ]))
pkgs.mcp-nixos
];
shellHook = ''
echo "Node $(node --version) / npm $(npm --version)"
elixir --version 2>/dev/null | head -1 || true
echo "Emacs $(emacs --version | head -1)"
'';
};
});
}

16
notes/bus/emt-madrid.org Normal file
View File

@@ -0,0 +1,16 @@
:PROPERTIES:
:ID: emt-madrid
:END:
#+title: EMT Madrid (urban bus)
Empresa Municipal de Transportes (EMT) operates the urban bus network
within the municipality of Madrid — around 200 lines.
* Notable lines
- *Line 27* — connects Embajadores with Plaza de Castilla along the Paseo de la
  Castellana, one of the oldest routes in the network.
- *Line 34* — Argüelles to Carabanchel, crossing the city centre via Gran Vía.
- *Búho (owl) lines* — night buses radiating out from Cibeles between midnight and 6 am.
* See also
- [[id:madrid-transport][Madrid Public Transport]]

View File

@@ -0,0 +1,17 @@
:PROPERTIES:
:ID: madrid-transport
:END:
#+title: Madrid Public Transport
Madrid has one of the most extensive public transport networks in Europe,
operated primarily by [[id:crtm][Consorcio Regional de Transportes de Madrid]] (CRTM).
* Modes
- [[id:metro-madrid][Metro de Madrid]] — 13 lines, ~300 km of track
- [[id:emt-madrid][EMT Bus]] — urban buses within the city
- Cercanías — suburban rail run by Renfe
- Interurbano — regional buses to the wider Community of Madrid
* Ticketing
A single [[https://www.crtm.es][tarjeta transporte]] (transport card) works across all modes.
The Multi card covers zones A–C and is topped up at any metro station.

View File

@@ -0,0 +1,18 @@
:PROPERTIES:
:ID: metro-madrid
:END:
#+title: Metro de Madrid
The Madrid Metro is the main rapid transit network in the city, opened in 1919.
It is the oldest metro in the Iberian Peninsula, predating Barcelona's (1924).
* Key Lines
| Line | Name                           | Colour    | Terminals                               |
|------+--------------------------------+-----------+-----------------------------------------|
| L1   | Pinar de Chamartín–Valdecarros | Blue      | Pinar de Chamartín / Valdecarros        |
| L6   | Circular                       | Grey      | Circular (loop)                         |
| L10  | —                              | Dark blue | Hospital Infanta Sofía / Puerta del Sur |
* See also
- [[id:madrid-transport][Madrid Public Transport]]
- [[id:sol-interchange][Sol interchange]]

View File

@@ -0,0 +1,12 @@
:PROPERTIES:
:ID: sol-interchange
:END:
#+title: Sol (interchange)
Sol is the busiest interchange station in the Madrid Metro, sitting beneath
Puerta del Sol in the city centre.
Lines serving Sol: [[id:metro-madrid][L1]], L2, L3.
It also connects to the Cercanías hub underneath, making it the de facto
zero point of Madrid's public transport.

22
notes/roads/crtm.org Normal file
View File

@@ -0,0 +1,22 @@
:PROPERTIES:
:ID: crtm
:END:
#+title: CRTM — Consorcio Regional de Transportes de Madrid
The CRTM is the regional authority that coordinates public transport across
the Community of Madrid. It does not operate services directly but sets
fares, zones, and integration policy.
* Fare zones
| Zone | Coverage |
|-------+-----------------------------|
| A | Municipality of Madrid |
| B1 | Inner ring municipalities |
| B2 | Outer ring municipalities |
| B3 | Further suburban area |
| C1–C2 | Commuter belt |
* Related
- [[id:madrid-transport][Madrid Public Transport]]
- [[id:metro-madrid][Metro de Madrid]]
- [[id:emt-madrid][EMT Madrid]]

19
notes/roads/m30.org Normal file
View File

@@ -0,0 +1,19 @@
:PROPERTIES:
:ID: m30
:END:
#+title: M-30
The M-30 is Madrid's innermost ring road, circling the city centre with a
total length of roughly 32 km.
It runs partly underground through the Madrid Río tunnel section along the
Manzanares river, built during the 2004–2007 renovation that reclaimed the
riverbank as a public park.
* Key junctions
- Nudo Norte — connects to A-1 (Burgos) and A-6 (La Coruña)
- Nudo Sur — connects to A-4 (Cádiz) and A-42 (Toledo)
* See also
- [[id:crtm][CRTM]]
- [[id:madrid-transport][Madrid Public Transport]]

10
opencode.json Normal file
View File

@@ -0,0 +1,10 @@
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"nixos": {
"type": "local",
"command": ["mcp-nixos"],
"enabled": true
}
}
}

View File

@@ -17,7 +17,10 @@
"check": "tsc --noEmit && npx prettier . --check",
"format": "npx prettier . --write",
"test": "tsx --test",
"profile": "0x -D prof ./quartz/bootstrap-cli.mjs build --concurrency=1"
"profile": "0x -D prof ./quartz/bootstrap-cli.mjs build --concurrency=1",
"export": "elixir scripts/export.exs",
"build:notes": "elixir scripts/export.exs && npx quartz build",
"serve:notes": "elixir scripts/export.exs && npx quartz build --serve"
},
"engines": {
"npm": ">=10.9.2",

View File

@@ -55,7 +55,7 @@ const config: QuartzConfig = {
},
plugins: {
transformers: [
Plugin.FrontMatter(),
Plugin.FrontMatter({ delimiters: "+++", language: "toml" }),
Plugin.CreatedModifiedDate({
priority: ["frontmatter", "git", "filesystem"],
}),
@@ -66,13 +66,16 @@ const config: QuartzConfig = {
},
keepBackground: false,
}),
Plugin.ObsidianFlavoredMarkdown({ enableInHtmlEmbed: false }),
// OxHugoFlavouredMarkdown must come before GitHubFlavoredMarkdown.
// Note: not compatible with ObsidianFlavoredMarkdown — use one or the other.
// FrontMatter above is configured with `delimiters: "+++"` and `language: "toml"`
// to match ox-hugo's default TOML front-matter output.
Plugin.OxHugoFlavouredMarkdown(),
Plugin.GitHubFlavoredMarkdown(),
Plugin.TableOfContents(),
Plugin.CrawlLinks({ markdownLinkResolution: "shortest" }),
Plugin.Description(),
Plugin.Latex({ renderEngine: "katex" }),
Plugin.ObsidianBases(),
],
filters: [Plugin.RemoveDrafts()],
emitters: [
@@ -91,7 +94,6 @@ const config: QuartzConfig = {
Plugin.NotFoundPage(),
// Comment out CustomOgImages to speed up build time
Plugin.CustomOgImages(),
Plugin.BasePage(),
],
},
}

View File

@@ -72,7 +72,7 @@ async function buildQuartz(argv: Argv, mut: Mutex, clientRefresh: () => void) {
perf.addEvent("glob")
const allFiles = await glob("**/*.*", argv.directory, cfg.configuration.ignorePatterns)
const markdownPaths = allFiles.filter((fp) => fp.endsWith(".md") || fp.endsWith(".base")).sort()
const markdownPaths = allFiles.filter((fp) => fp.endsWith(".md")).sort()
console.log(
`Found ${markdownPaths.length} input files from \`${argv.directory}\` in ${perf.timeSince("glob")}`,
)

View File

@@ -1,218 +0,0 @@
import { JSX } from "preact"
import { QuartzComponent, QuartzComponentConstructor, QuartzComponentProps } from "./types"
import { classNames } from "../util/lang"
import { resolveRelative } from "../util/path"
// @ts-ignore
import script from "./scripts/base-view-selector.inline"
import baseViewSelectorStyle from "./styles/baseViewSelector.scss"
const icons: Record<string, JSX.Element> = {
table: (
<svg
xmlns="http://www.w3.org/2000/svg"
width="16"
height="16"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
stroke-width="2"
stroke-linecap="round"
stroke-linejoin="round"
>
<rect width="18" height="18" x="3" y="3" rx="2" />
<path d="M3 9h18" />
<path d="M3 15h18" />
<path d="M9 3v18" />
<path d="M15 3v18" />
</svg>
),
list: (
<svg
xmlns="http://www.w3.org/2000/svg"
width="16"
height="16"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
stroke-width="2"
stroke-linecap="round"
stroke-linejoin="round"
>
<line x1="8" x2="21" y1="6" y2="6" />
<line x1="8" x2="21" y1="12" y2="12" />
<line x1="8" x2="21" y1="18" y2="18" />
<line x1="3" x2="3.01" y1="6" y2="6" />
<line x1="3" x2="3.01" y1="12" y2="12" />
<line x1="3" x2="3.01" y1="18" y2="18" />
</svg>
),
chevronsUpDown: (
<svg
xmlns="http://www.w3.org/2000/svg"
width="16"
height="16"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
stroke-width="2"
stroke-linecap="round"
stroke-linejoin="round"
>
<path d="m7 15 5 5 5-5" />
<path d="m7 9 5-5 5 5" />
</svg>
),
chevronRight: (
<svg
xmlns="http://www.w3.org/2000/svg"
width="16"
height="16"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
stroke-width="2"
stroke-linecap="round"
stroke-linejoin="round"
>
<path d="m9 18 6-6-6-6" />
</svg>
),
x: (
<svg
xmlns="http://www.w3.org/2000/svg"
width="16"
height="16"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
stroke-width="2"
stroke-linecap="round"
stroke-linejoin="round"
>
<path d="M18 6 6 18" />
<path d="m6 6 12 12" />
</svg>
),
map: (
<svg
xmlns="http://www.w3.org/2000/svg"
width="16"
height="16"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
stroke-width="2"
stroke-linecap="round"
stroke-linejoin="round"
>
<path d="M15 6v12a3 3 0 1 0 3-3H6a3 3 0 1 0 3 3V6a3 3 0 1 0-3 3h12a3 3 0 1 0-3-3" />
</svg>
),
card: (
<svg
xmlns="http://www.w3.org/2000/svg"
width="16"
height="16"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
stroke-width="2"
stroke-linecap="round"
stroke-linejoin="round"
>
<rect width="7" height="7" x="3" y="3" rx="1" />
<rect width="7" height="7" x="14" y="3" rx="1" />
<rect width="7" height="7" x="14" y="14" rx="1" />
<rect width="7" height="7" x="3" y="14" rx="1" />
</svg>
),
}
const viewTypeIcons: Record<string, JSX.Element | undefined> = {
table: icons.table,
list: icons.list,
gallery: icons.card,
board: icons.table,
calendar: icons.table,
map: icons.map,
cards: icons.card,
}
const BaseViewSelector: QuartzComponent = ({ fileData, displayClass }: QuartzComponentProps) => {
const baseMeta = fileData.basesMetadata
if (!baseMeta || baseMeta.allViews.length <= 1) {
return null
}
const currentViewName = baseMeta.currentView
const allViews = baseMeta.allViews
const currentIcon =
viewTypeIcons[allViews.find((view) => view.name === currentViewName)?.type ?? ""] ?? icons.table
return (
<div class={classNames(displayClass, "bases-toolbar")} data-base-view-selector>
<div class="bases-toolbar-item bases-toolbar-views-menu">
<span
class="text-icon-button"
aria-label="Select view"
aria-expanded="false"
aria-haspopup="true"
role="button"
tabindex={0}
>
<span class="text-button-icon">{currentIcon}</span>
<span class="text-button-label">{currentViewName.toLowerCase()}</span>
<span class="text-button-icon mod-aux">{icons.chevronsUpDown}</span>
</span>
</div>
<div class="menu-scroll" data-dropdown>
<div class="bases-toolbar-menu-container">
<div class="search-input-container">
<input type="search" placeholder="Search..." data-search-input />
<div class="search-input-clear-button" data-clear-search hidden>
{icons.x}
</div>
</div>
<div class="bases-toolbar-items">
<div class="suggestion-group" data-group="views" data-view-list>
{allViews.map((view) => {
const isActive = view.name === currentViewName
const icon = viewTypeIcons[view.type] || icons.table
const href = resolveRelative(fileData.slug!, view.slug)
return (
<a
href={href}
data-slug={view.slug}
class={
isActive
? "suggestion-item bases-toolbar-menu-item mod-active is-selected"
: "suggestion-item bases-toolbar-menu-item"
}
data-view-name={view.name}
data-view-type={view.type}
>
<div class="bases-toolbar-menu-item-info">
<div class="bases-toolbar-menu-item-info-icon">{icon}</div>
<div class="bases-toolbar-menu-item-name">{view.name.toLowerCase()}</div>
</div>
<div class="clickable-icon bases-toolbar-menu-item-icon">
{icons.chevronRight}
</div>
</a>
)
})}
</div>
</div>
</div>
</div>
</div>
)
}
BaseViewSelector.css = baseViewSelectorStyle
BaseViewSelector.afterDOMLoaded = script
export default (() => BaseViewSelector) satisfies QuartzComponentConstructor

View File

@@ -51,9 +51,7 @@ export default ((opts?: Partial<BreadcrumbOptions>) => {
ctx,
}: QuartzComponentProps) => {
const trie = (ctx.trie ??= trieFromAllFiles(allFiles))
const baseMeta = fileData.basesMetadata
const slugParts = (baseMeta ? baseMeta.baseSlug : fileData.slug!).split("/")
const slugParts = fileData.slug!.split("/")
const pathNodes = trie.ancestryChain(slugParts)
if (!pathNodes) {
@@ -66,24 +64,14 @@ export default ((opts?: Partial<BreadcrumbOptions>) => {
crumb.displayName = options.rootName
}
// For last node (current page), set empty path
if (idx === pathNodes.length - 1) {
if (baseMeta) {
crumb.path = resolveRelative(fileData.slug!, simplifySlug(baseMeta.baseSlug))
} else {
crumb.path = ""
}
crumb.path = ""
}
return crumb
})
if (baseMeta && options.showCurrentPage) {
crumbs.push({
displayName: baseMeta.currentView.replaceAll("-", " "),
path: "",
})
}
if (!options.showCurrentPage) {
crumbs.pop()
}

View File

@@ -1,8 +1,6 @@
import Content from "./pages/Content"
import TagContent from "./pages/TagContent"
import FolderContent from "./pages/FolderContent"
import BaseContent from "./pages/BaseContent"
import BaseViewSelector from "./BaseViewSelector"
import NotFound from "./pages/404"
import ArticleTitle from "./ArticleTitle"
import Darkmode from "./Darkmode"
@@ -31,8 +29,6 @@ export {
Content,
TagContent,
FolderContent,
BaseContent,
BaseViewSelector,
Darkmode,
ReaderMode,
Head,

View File

@@ -1,20 +0,0 @@
import { QuartzComponent, QuartzComponentConstructor, QuartzComponentProps } from "../types"
import style from "../styles/basePage.scss"
import { htmlToJsx } from "../../util/jsx"
export default (() => {
const BaseContent: QuartzComponent = (props: QuartzComponentProps) => {
const { fileData, tree } = props
return (
<div class="popover-hint">
<article class={["base-content", ...(fileData.frontmatter?.cssclasses ?? [])].join(" ")}>
{htmlToJsx(fileData.filePath!, fileData.basesRenderedTree ?? tree)}
</article>
</div>
)
}
BaseContent.css = style
return BaseContent
}) satisfies QuartzComponentConstructor

View File

@@ -1,144 +0,0 @@
let documentClickHandler: ((e: MouseEvent) => void) | null = null
function setupBaseViewSelector() {
const selectors = document.querySelectorAll("[data-base-view-selector]")
if (selectors.length === 0) return
if (!documentClickHandler) {
documentClickHandler = (e: MouseEvent) => {
document.querySelectorAll("[data-base-view-selector]").forEach((selector) => {
if (selector.contains(e.target as Node)) return
const trigger = selector.querySelector(".text-icon-button") as HTMLElement | null
if (trigger?.getAttribute("aria-expanded") === "true") {
selector.dispatchEvent(new CustomEvent("close-dropdown"))
}
})
}
document.addEventListener("click", documentClickHandler)
window.addCleanup(() => {
if (documentClickHandler) {
document.removeEventListener("click", documentClickHandler)
documentClickHandler = null
}
})
}
selectors.forEach((selector) => {
if (selector.hasAttribute("data-initialized")) return
selector.setAttribute("data-initialized", "true")
const triggerEl = selector.querySelector(".text-icon-button") as HTMLElement | null
const searchInputEl = selector.querySelector("[data-search-input]") as HTMLInputElement | null
const clearButtonEl = selector.querySelector("[data-clear-search]") as HTMLElement | null
const viewListEl = selector.querySelector("[data-view-list]") as HTMLElement | null
if (!triggerEl || !searchInputEl || !clearButtonEl || !viewListEl) return
const trigger = triggerEl
const searchInput = searchInputEl
const clearButton = clearButtonEl
const viewList = viewListEl
function toggleDropdown() {
if (trigger.getAttribute("aria-expanded") === "true") {
closeDropdown()
return
}
openDropdown()
}
function openDropdown() {
trigger.setAttribute("aria-expanded", "true")
trigger.classList.add("has-active-menu")
setTimeout(() => searchInput.focus(), 10)
}
function closeDropdown() {
trigger.setAttribute("aria-expanded", "false")
trigger.classList.remove("has-active-menu")
searchInput.value = ""
clearButton.hidden = true
filterViews("")
}
function filterViews(query: string) {
const items = viewList.querySelectorAll<HTMLElement>(".bases-toolbar-menu-item")
const lowerQuery = query.toLowerCase()
items.forEach((item) => {
const viewName = (item.getAttribute("data-view-name") || "").toLowerCase()
const viewType = (item.getAttribute("data-view-type") || "").toLowerCase()
const matches = viewName.includes(lowerQuery) || viewType.includes(lowerQuery)
item.style.display = matches ? "" : "none"
})
}
function handleSearchInput() {
const query = searchInput.value
filterViews(query)
clearButton.hidden = query.length === 0
}
function clearSearch() {
searchInput.value = ""
clearButton.hidden = true
filterViews("")
searchInput.focus()
}
const handleTriggerClick = (e: MouseEvent) => {
e.stopPropagation()
toggleDropdown()
}
const handleTriggerKeydown = (e: KeyboardEvent) => {
if (e.key === "Enter" || e.key === " ") {
e.preventDefault()
toggleDropdown()
}
}
const handleSearchKeydown = (e: KeyboardEvent) => {
if (e.key === "Escape") {
if (searchInput.value) {
clearSearch()
} else {
closeDropdown()
}
}
}
const handleClearClick = (e: MouseEvent) => {
e.stopPropagation()
clearSearch()
}
trigger.addEventListener("click", handleTriggerClick)
trigger.addEventListener("keydown", handleTriggerKeydown)
searchInput.addEventListener("input", handleSearchInput)
searchInput.addEventListener("keydown", handleSearchKeydown)
clearButton.addEventListener("click", handleClearClick)
const viewLinks = viewList.querySelectorAll(".bases-toolbar-menu-item")
viewLinks.forEach((link) => {
link.addEventListener("click", closeDropdown)
window.addCleanup(() => link.removeEventListener("click", closeDropdown))
})
selector.addEventListener("close-dropdown", closeDropdown)
window.addCleanup(() => {
trigger.removeEventListener("click", handleTriggerClick)
trigger.removeEventListener("keydown", handleTriggerKeydown)
searchInput.removeEventListener("input", handleSearchInput)
searchInput.removeEventListener("keydown", handleSearchKeydown)
clearButton.removeEventListener("click", handleClearClick)
selector.removeEventListener("close-dropdown", closeDropdown)
selector.removeAttribute("data-initialized")
closeDropdown()
})
})
}
document.addEventListener("nav", setupBaseViewSelector)

View File

@@ -1,299 +0,0 @@
.base-content {
width: 100%;
}
.base-view {
width: 100%;
overflow-x: auto;
}
.base-table {
width: 100%;
border-collapse: collapse;
font-size: 0.875rem;
th,
td {
padding: 0.5rem 0.75rem;
text-align: left;
border-bottom: 1px solid var(--lightgray);
}
th {
font-weight: 600;
color: var(--darkgray);
background: var(--light);
position: sticky;
top: 0;
}
tbody tr:hover {
background: var(--light);
}
a.internal {
color: var(--secondary);
text-decoration: none;
&:hover {
text-decoration: underline;
}
}
}
.base-group-header td {
font-weight: 600;
background: var(--light);
color: var(--dark);
padding-top: 1rem;
}
.base-summary-row {
background: var(--light);
font-weight: 500;
.base-summary-cell {
border-top: 2px solid var(--lightgray);
color: var(--darkgray);
}
}
.base-checkbox {
pointer-events: none;
width: 1rem;
height: 1rem;
accent-color: var(--secondary);
}
.base-list {
list-style: none;
padding: 0;
margin: 0;
li {
padding: 0.375rem 0;
border-bottom: 1px solid var(--lightgray);
&:last-child {
border-bottom: none;
}
}
a.internal {
color: var(--secondary);
text-decoration: none;
&:hover {
text-decoration: underline;
}
}
}
.base-list-container {
display: flex;
flex-direction: column;
gap: 1.5rem;
}
.base-list-group {
.base-list-group-header {
font-size: 1rem;
font-weight: 600;
margin-bottom: 0.5rem;
color: var(--dark);
}
}
.base-list-nested {
list-style: none;
padding-left: 1rem;
margin-top: 0.25rem;
font-size: 0.8125rem;
color: var(--darkgray);
}
.base-list-meta-label {
font-weight: 500;
}
.base-card-grid {
--base-card-min: 200px;
--base-card-aspect: 1.4;
display: grid;
grid-template-columns: repeat(auto-fill, minmax(var(--base-card-min), 1fr));
gap: 1rem;
}
.base-card-container {
display: flex;
flex-direction: column;
gap: 1.5rem;
}
.base-card-group {
.base-card-group-header {
font-size: 1rem;
font-weight: 600;
margin-bottom: 0.75rem;
color: var(--dark);
}
}
.base-card {
display: flex;
flex-direction: column;
border: 1px solid var(--lightgray);
border-radius: 8px;
overflow: hidden;
background: var(--light);
transition: box-shadow 0.15s ease;
&:hover {
box-shadow: 0 4px 12px rgba(0, 0, 0, 0.1);
}
}
.base-card-image-link {
display: block;
aspect-ratio: var(--base-card-aspect);
background-position: center;
background-repeat: no-repeat;
}
.base-card-content {
padding: 0.75rem;
display: flex;
flex-direction: column;
gap: 0.5rem;
}
.base-card-title-link {
text-decoration: none;
color: inherit;
&:hover .base-card-title {
color: var(--secondary);
}
}
.base-card-title {
font-size: 0.9375rem;
font-weight: 600;
margin: 0;
line-height: 1.3;
transition: color 0.15s ease;
}
.base-card-meta {
display: flex;
flex-direction: column;
gap: 0.25rem;
font-size: 0.8125rem;
color: var(--darkgray);
}
.base-card-meta-item {
display: flex;
gap: 0.25rem;
}
.base-card-meta-label {
font-weight: 500;
&::after {
content: ":";
}
}
.base-calendar-container {
display: flex;
flex-direction: column;
gap: 1.5rem;
}
.base-calendar-group {
.base-calendar-group-header {
font-size: 0.9375rem;
font-weight: 600;
margin-bottom: 0.5rem;
color: var(--dark);
font-variant-numeric: tabular-nums;
}
}
.base-map {
width: 100%;
min-height: 400px;
background: var(--light);
border: 1px solid var(--lightgray);
border-radius: 8px;
display: flex;
align-items: center;
justify-content: center;
color: var(--darkgray);
&::before {
content: "Map view requires client-side JavaScript";
font-size: 0.875rem;
}
}
.base-diagnostics {
background: #fff3cd;
border: 1px solid #ffc107;
border-radius: 4px;
padding: 1rem;
margin-bottom: 1rem;
font-size: 0.875rem;
}
.base-diagnostics-title {
font-weight: 600;
margin-bottom: 0.5rem;
color: #856404;
}
.base-diagnostics-meta {
display: flex;
gap: 0.5rem;
margin-bottom: 0.75rem;
color: #856404;
}
.base-diagnostics-page {
font-family: var(--codeFont);
}
.base-diagnostics-list {
list-style: none;
padding: 0;
margin: 0;
display: flex;
flex-direction: column;
gap: 0.5rem;
}
.base-diagnostics-item {
background: white;
padding: 0.5rem;
border-radius: 4px;
}
.base-diagnostics-label {
font-weight: 500;
color: #856404;
}
.base-diagnostics-message {
color: #664d03;
margin: 0.25rem 0;
}
.base-diagnostics-source {
display: block;
font-size: 0.8125rem;
color: #6c757d;
white-space: pre-wrap;
word-break: break-all;
}

View File

@@ -1,275 +0,0 @@
@use "../../styles/variables.scss" as *;
.bases-toolbar {
position: relative;
display: inline-block;
margin: 1rem 0;
font-family: var(--bodyFont);
.bases-toolbar-item {
display: inline-block;
position: relative;
&.bases-toolbar-views-menu {
.text-icon-button {
display: flex;
align-items: center;
gap: 0.375rem;
padding: 0.375rem 0.75rem;
background: var(--light);
border: 1px solid var(--lightgray);
border-radius: 6px;
color: var(--darkgray);
font-size: 0.875rem;
font-weight: 500;
cursor: pointer;
transition: all 0.15s ease;
user-select: none;
&:hover {
background: var(--highlight);
border-color: var(--gray);
}
&.has-active-menu {
border-color: var(--secondary);
background: var(--highlight);
}
.text-button-icon {
display: flex;
align-items: center;
justify-content: center;
width: 16px;
height: 16px;
color: var(--gray);
flex-shrink: 0;
svg {
width: 16px;
height: 16px;
}
&.mod-aux {
opacity: 0.7;
}
}
.text-button-label {
font-size: 0.875rem;
color: var(--dark);
font-weight: 500;
}
}
}
}
.menu-scroll {
position: absolute;
top: calc(100% + 0.5rem);
left: 0;
z-index: 100;
max-height: 400px;
background: var(--light);
border: 1px solid var(--lightgray);
border-radius: 8px;
box-shadow:
0 4px 6px -1px rgb(0 0 0 / 0.1),
0 2px 4px -2px rgb(0 0 0 / 0.1);
overflow: hidden;
min-width: 280px;
display: none;
}
&:has(.text-icon-button.has-active-menu) .menu-scroll {
display: block;
}
.bases-toolbar-menu-container {
display: flex;
flex-direction: column;
max-height: 400px;
.search-input-container {
position: relative;
padding: 0.5rem;
border-bottom: 1px solid var(--lightgray);
input[type="search"] {
width: 100%;
padding: 0.375rem 0.75rem;
padding-right: 2rem;
background: var(--light);
border: 1px solid var(--secondary);
border-radius: 6px;
font-size: 0.875rem;
color: var(--dark);
outline: none;
transition: box-shadow 0.15s ease;
font-family: var(--bodyFont);
&::placeholder {
color: var(--gray);
opacity: 0.7;
}
&:focus {
box-shadow: 0 0 0 2px var(--highlight);
}
&::-webkit-search-cancel-button {
display: none;
}
}
.search-input-clear-button {
position: absolute;
right: 1rem;
top: 50%;
transform: translateY(-50%);
display: flex;
align-items: center;
justify-content: center;
width: 20px;
height: 20px;
cursor: pointer;
opacity: 0.5;
transition: opacity 0.15s ease;
color: var(--gray);
&:hover {
opacity: 1;
}
&[hidden] {
display: none;
}
svg {
width: 14px;
height: 14px;
}
}
}
.bases-toolbar-items {
overflow-y: auto;
max-height: 340px;
&::-webkit-scrollbar {
width: 8px;
}
&::-webkit-scrollbar-track {
background: transparent;
}
&::-webkit-scrollbar-thumb {
background: var(--lightgray);
border-radius: 4px;
&:hover {
background: var(--gray);
}
}
.suggestion-group {
&[data-group="views"] {
padding: 0.25rem 0;
text-transform: lowercase;
}
}
.suggestion-item {
display: block;
text-decoration: none;
color: inherit;
cursor: pointer;
&.bases-toolbar-menu-item {
display: flex;
align-items: center;
justify-content: space-between;
padding: 0.5rem 0.75rem;
margin: 0 0.25rem;
border-radius: 4px;
transition: background 0.15s ease;
&:hover {
background: var(--lightgray);
}
&.mod-active {
font-weight: $semiBoldWeight;
}
&.is-selected {
.bases-toolbar-menu-item-info {
.bases-toolbar-menu-item-name {
font-weight: 600;
color: var(--secondary);
}
}
}
.bases-toolbar-menu-item-info {
display: flex;
align-items: center;
gap: 0.5rem;
flex: 1;
.bases-toolbar-menu-item-info-icon {
display: flex;
align-items: center;
justify-content: center;
width: 16px;
height: 16px;
color: var(--gray);
flex-shrink: 0;
svg {
width: 16px;
height: 16px;
}
}
.bases-toolbar-menu-item-name {
font-size: 0.875rem;
color: var(--dark);
}
}
.clickable-icon.bases-toolbar-menu-item-icon {
display: flex;
align-items: center;
justify-content: center;
width: 16px;
height: 16px;
opacity: 0;
transition: opacity 0.15s ease;
color: var(--gray);
flex-shrink: 0;
svg {
width: 16px;
height: 16px;
}
}
&:hover .clickable-icon.bases-toolbar-menu-item-icon {
opacity: 0.5;
}
}
}
}
}
}
@media all and ($mobile) {
.bases-toolbar {
.menu-scroll {
min-width: 240px;
left: auto;
}
}
}

View File

@@ -7,12 +7,8 @@ import { Argv } from "../../util/ctx"
import { QuartzConfig } from "../../cfg"
const filesToCopy = async (argv: Argv, cfg: QuartzConfig) => {
// glob all non MD/base files in content folder and copy it over
return await glob("**", argv.directory, [
"**/*.md",
"**/*.base",
...cfg.configuration.ignorePatterns,
])
// glob all non MD files in content folder and copy it over
return await glob("**", argv.directory, ["**/*.md", ...cfg.configuration.ignorePatterns])
}
const copyFile = async (argv: Argv, fp: FilePath) => {
@@ -41,7 +37,7 @@ export const Assets: QuartzEmitterPlugin = () => {
async *partialEmit(ctx, _content, _resources, changeEvents) {
for (const changeEvent of changeEvents) {
const ext = path.extname(changeEvent.path)
if (ext === ".md" || ext === ".base") continue
if (ext === ".md") continue
if (changeEvent.type === "add" || changeEvent.type === "change") {
yield copyFile(ctx.argv, changeEvent.path)

View File

@@ -1,184 +0,0 @@
import { QuartzEmitterPlugin } from "../types"
import { QuartzComponentProps } from "../../components/types"
import HeaderConstructor from "../../components/Header"
import BodyConstructor from "../../components/Body"
import { pageResources, renderPage } from "../../components/renderPage"
import { ProcessedContent, QuartzPluginData } from "../vfile"
import { FullPageLayout } from "../../cfg"
import { pathToRoot } from "../../util/path"
import { defaultListPageLayout, sharedPageComponents } from "../../../quartz.layout"
import { BaseContent, BaseViewSelector } from "../../components"
import { write } from "./helpers"
import { BuildCtx } from "../../util/ctx"
import { StaticResources } from "../../util/resources"
import {
renderBaseViewsForFile,
RenderedBaseView,
BaseViewMeta,
BaseMetadata,
} from "../../util/base/render"
import { BaseFile } from "../../util/base/types"
interface BasePageOptions extends FullPageLayout {}
function isBaseFile(data: QuartzPluginData): boolean {
return Boolean(data.basesConfig && (data.basesConfig as BaseFile).views?.length > 0)
}
function getBaseFiles(content: ProcessedContent[]): ProcessedContent[] {
return content.filter(([_, file]) => isBaseFile(file.data))
}
async function processBasePage(
ctx: BuildCtx,
baseFileData: QuartzPluginData,
renderedView: RenderedBaseView,
allViews: BaseViewMeta[],
allFiles: QuartzPluginData[],
opts: FullPageLayout,
resources: StaticResources,
) {
const slug = renderedView.slug
const cfg = ctx.cfg.configuration
const externalResources = pageResources(pathToRoot(slug), resources)
const viewFileData: QuartzPluginData = {
...baseFileData,
slug,
frontmatter: {
...baseFileData.frontmatter,
title: renderedView.view.name,
},
basesRenderedTree: renderedView.tree,
basesAllViews: allViews,
basesCurrentView: renderedView.view.name,
basesMetadata: {
baseSlug: baseFileData.slug!,
currentView: renderedView.view.name,
allViews,
},
}
const componentData: QuartzComponentProps = {
ctx,
fileData: viewFileData,
externalResources,
cfg,
children: [],
tree: renderedView.tree,
allFiles,
}
const content = renderPage(cfg, slug, componentData, opts, externalResources)
return write({
ctx,
content,
slug,
ext: ".html",
})
}
export const BasePage: QuartzEmitterPlugin<Partial<BasePageOptions>> = (userOpts) => {
const baseOpts: FullPageLayout = {
...sharedPageComponents,
...defaultListPageLayout,
pageBody: BaseContent(),
...userOpts,
}
const opts: FullPageLayout = {
...baseOpts,
beforeBody: [
...baseOpts.beforeBody.filter((component) => component.name !== "ArticleTitle"),
BaseViewSelector(),
],
}
const { head: Head, header, beforeBody, pageBody, afterBody, left, right, footer: Footer } = opts
const Header = HeaderConstructor()
const Body = BodyConstructor()
return {
name: "BasePage",
getQuartzComponents() {
return [
Head,
Header,
Body,
...header,
...beforeBody,
pageBody,
...afterBody,
...left,
...right,
Footer,
]
},
async *emit(ctx, content, resources) {
const allFiles = content.map((c) => c[1].data)
const baseFiles = getBaseFiles(content)
for (const [_, file] of baseFiles) {
const baseFileData = file.data
const { views, allViews } = renderBaseViewsForFile(baseFileData, allFiles)
for (const renderedView of views) {
yield processBasePage(
ctx,
baseFileData,
renderedView,
allViews,
allFiles,
opts,
resources,
)
}
}
},
async *partialEmit(ctx, content, resources, changeEvents) {
const allFiles = content.map((c) => c[1].data)
const baseFiles = getBaseFiles(content)
const affectedBaseSlugs = new Set<string>()
for (const event of changeEvents) {
if (!event.file) continue
const slug = event.file.data.slug
if (slug && isBaseFile(event.file.data)) {
affectedBaseSlugs.add(slug)
}
}
for (const [_, file] of baseFiles) {
const baseFileData = file.data
const baseSlug = baseFileData.slug
if (!baseSlug || !affectedBaseSlugs.has(baseSlug)) continue
const { views, allViews } = renderBaseViewsForFile(baseFileData, allFiles)
for (const renderedView of views) {
yield processBasePage(
ctx,
baseFileData,
renderedView,
allViews,
allFiles,
opts,
resources,
)
}
}
},
}
}
declare module "vfile" {
interface DataMap {
basesRenderedTree?: import("hast").Root
basesAllViews?: BaseViewMeta[]
basesCurrentView?: string
basesMetadata?: BaseMetadata
}
}

View File

@@ -83,8 +83,6 @@ export const ContentPage: QuartzEmitterPlugin<Partial<FullPageLayout>> = (userOp
containsIndex = true
}
if (file.data.filePath!.endsWith(".base")) continue
// only process home page, non-tag pages, and non-index pages
if (slug.endsWith("/index") || slug.startsWith("tags/")) continue
yield processContent(ctx, tree, file.data, allFiles, opts, resources)
@@ -114,7 +112,6 @@ export const ContentPage: QuartzEmitterPlugin<Partial<FullPageLayout>> = (userOp
for (const [tree, file] of content) {
const slug = file.data.slug!
if (!changedSlugs.has(slug)) continue
if (file.data.filePath!.endsWith(".base")) continue
if (slug.endsWith("/index") || slug.startsWith("tags/")) continue
yield processContent(ctx, tree, file.data, allFiles, opts, resources)

View File

@@ -10,4 +10,3 @@ export { ComponentResources } from "./componentResources"
export { NotFoundPage } from "./404"
export { CNAME } from "./cname"
export { CustomOgImages } from "./ogImage"
export { BasePage } from "./basePage"

View File

@@ -1,521 +0,0 @@
import * as yaml from "js-yaml"
import { QuartzTransformerPlugin } from "../types"
import { FilePath, getFileExtension } from "../../util/path"
import {
BaseFile,
BaseView,
BaseFileFilter,
parseViews,
parseViewSummaries,
BUILTIN_SUMMARY_TYPES,
BuiltinSummaryType,
} from "../../util/base/types"
import {
parseExpressionSource,
compileExpression,
buildPropertyExpressionSource,
ProgramIR,
BasesExpressions,
BaseExpressionDiagnostic,
Span,
} from "../../util/base/compiler"
export interface BasesOptions {
/** Whether to emit diagnostics as warnings during build */
emitWarnings: boolean
}
const defaultOptions: BasesOptions = {
emitWarnings: true,
}
type FilterStructure =
| string
| { and?: FilterStructure[]; or?: FilterStructure[]; not?: FilterStructure[] }
function compileFilterStructure(
filter: FilterStructure | undefined,
file: string,
diagnostics: BaseExpressionDiagnostic[],
context: string,
): ProgramIR | undefined {
if (!filter) return undefined
if (typeof filter === "string") {
const result = parseExpressionSource(filter, file)
if (result.diagnostics.length > 0) {
for (const diag of result.diagnostics) {
diagnostics.push({
kind: diag.kind as "lex" | "parse" | "runtime",
message: diag.message,
span: diag.span,
context,
source: filter,
})
}
}
if (!result.program.body) return undefined
return compileExpression(result.program.body)
}
const compileParts = (
parts: FilterStructure[],
combiner: "&&" | "||",
negate: boolean,
): ProgramIR | undefined => {
const compiled: ProgramIR[] = []
for (const part of parts) {
const partIR = compileFilterStructure(part, file, diagnostics, context)
if (partIR) compiled.push(partIR)
}
if (compiled.length === 0) return undefined
if (compiled.length === 1) {
if (negate) {
return wrapWithNot(compiled[0])
}
return compiled[0]
}
let result = compiled[0]
for (let i = 1; i < compiled.length; i++) {
result = combineWithLogical(result, compiled[i], combiner, negate)
}
return result
}
if (filter.and && filter.and.length > 0) {
return compileParts(filter.and, "&&", false)
}
if (filter.or && filter.or.length > 0) {
return compileParts(filter.or, "||", false)
}
if (filter.not && filter.not.length > 0) {
return compileParts(filter.not, "&&", true)
}
return undefined
}
function wrapWithNot(ir: ProgramIR): ProgramIR {
const span = ir.span
return {
instructions: [
...ir.instructions,
{ op: "to_bool" as const, span },
{ op: "unary" as const, operator: "!" as const, span },
],
span,
}
}
function combineWithLogical(
left: ProgramIR,
right: ProgramIR,
operator: "&&" | "||",
negateRight: boolean,
): ProgramIR {
const span: Span = {
start: left.span.start,
end: right.span.end,
file: left.span.file,
}
const rightIR = negateRight ? wrapWithNot(right) : right
if (operator === "&&") {
const jumpIfFalseIndex = left.instructions.length + 1
const jumpIndex = jumpIfFalseIndex + rightIR.instructions.length + 2
return {
instructions: [
...left.instructions,
{ op: "jump_if_false" as const, target: jumpIndex, span },
...rightIR.instructions,
{ op: "to_bool" as const, span },
{ op: "jump" as const, target: jumpIndex + 1, span },
{
op: "const" as const,
literal: { type: "Literal" as const, kind: "boolean" as const, value: false, span },
span,
},
],
span,
}
} else {
const jumpIfTrueIndex = left.instructions.length + 1
const jumpIndex = jumpIfTrueIndex + rightIR.instructions.length + 2
return {
instructions: [
...left.instructions,
{ op: "jump_if_true" as const, target: jumpIndex, span },
...rightIR.instructions,
{ op: "to_bool" as const, span },
{ op: "jump" as const, target: jumpIndex + 1, span },
{
op: "const" as const,
literal: { type: "Literal" as const, kind: "boolean" as const, value: true, span },
span,
},
],
span,
}
}
}
function collectPropertiesFromViews(views: BaseView[]): Set<string> {
const properties = new Set<string>()
for (const view of views) {
if (view.order) {
for (const prop of view.order) {
properties.add(prop)
}
}
if (view.groupBy) {
const groupProp = typeof view.groupBy === "string" ? view.groupBy : view.groupBy.property
properties.add(groupProp)
}
if (view.sort) {
for (const sortConfig of view.sort) {
properties.add(sortConfig.property)
}
}
if (view.image) properties.add(view.image)
if (view.date) properties.add(view.date)
if (view.dateField) properties.add(view.dateField)
if (view.dateProperty) properties.add(view.dateProperty)
if (view.coordinates) properties.add(view.coordinates)
if (view.markerIcon) properties.add(view.markerIcon)
if (view.markerColor) properties.add(view.markerColor)
}
return properties
}
function compilePropertyExpressions(
properties: Set<string>,
file: string,
diagnostics: BaseExpressionDiagnostic[],
): Record<string, ProgramIR> {
const expressions: Record<string, ProgramIR> = {}
for (const property of properties) {
const source = buildPropertyExpressionSource(property)
if (!source) continue
const result = parseExpressionSource(source, file)
if (result.diagnostics.length > 0) {
for (const diag of result.diagnostics) {
diagnostics.push({
kind: diag.kind as "lex" | "parse" | "runtime",
message: diag.message,
span: diag.span,
context: `property.${property}`,
source,
})
}
}
if (result.program.body) {
expressions[property] = compileExpression(result.program.body)
}
}
return expressions
}
function compileFormulas(
formulas: Record<string, string> | undefined,
file: string,
diagnostics: BaseExpressionDiagnostic[],
): Record<string, ProgramIR> {
if (!formulas) return {}
const compiled: Record<string, ProgramIR> = {}
for (const [name, source] of Object.entries(formulas)) {
const trimmed = source.trim()
if (!trimmed) continue
const result = parseExpressionSource(trimmed, file)
if (result.diagnostics.length > 0) {
for (const diag of result.diagnostics) {
diagnostics.push({
kind: diag.kind as "lex" | "parse" | "runtime",
message: diag.message,
span: diag.span,
context: `formulas.${name}`,
source: trimmed,
})
}
}
if (result.program.body) {
compiled[name] = compileExpression(result.program.body)
}
}
return compiled
}
function compileSummaries(
summaries: Record<string, string> | undefined,
file: string,
diagnostics: BaseExpressionDiagnostic[],
): Record<string, ProgramIR> {
if (!summaries) return {}
const compiled: Record<string, ProgramIR> = {}
for (const [name, source] of Object.entries(summaries)) {
const trimmed = source.trim()
if (!trimmed) continue
const normalized = trimmed.toLowerCase()
if (BUILTIN_SUMMARY_TYPES.includes(normalized as BuiltinSummaryType)) {
continue
}
const result = parseExpressionSource(trimmed, file)
if (result.diagnostics.length > 0) {
for (const diag of result.diagnostics) {
diagnostics.push({
kind: diag.kind as "lex" | "parse" | "runtime",
message: diag.message,
span: diag.span,
context: `summaries.${name}`,
source: trimmed,
})
}
}
if (result.program.body) {
compiled[name] = compileExpression(result.program.body)
}
}
return compiled
}
function compileViewSummaries(
views: BaseView[],
topLevelSummaries: Record<string, string> | undefined,
file: string,
diagnostics: BaseExpressionDiagnostic[],
): Record<string, Record<string, ProgramIR>> {
const result: Record<string, Record<string, ProgramIR>> = {}
for (let i = 0; i < views.length; i++) {
const view = views[i]
if (!view.summaries) continue
const viewSummaryConfig = parseViewSummaries(
view.summaries as Record<string, string>,
topLevelSummaries,
)
if (!viewSummaryConfig?.columns) continue
const viewExpressions: Record<string, ProgramIR> = {}
for (const [column, def] of Object.entries(viewSummaryConfig.columns)) {
if (def.type !== "formula" || !def.expression) continue
const parseResult = parseExpressionSource(def.expression, file)
if (parseResult.diagnostics.length > 0) {
for (const diag of parseResult.diagnostics) {
diagnostics.push({
kind: diag.kind as "lex" | "parse" | "runtime",
message: diag.message,
span: diag.span,
context: `views[${i}].summaries.${column}`,
source: def.expression,
})
}
}
if (parseResult.program.body) {
viewExpressions[column] = compileExpression(parseResult.program.body)
}
}
if (Object.keys(viewExpressions).length > 0) {
result[String(i)] = viewExpressions
}
}
return result
}
export const ObsidianBases: QuartzTransformerPlugin<Partial<BasesOptions>> = (userOpts) => {
const opts = { ...defaultOptions, ...userOpts }
return {
name: "ObsidianBases",
textTransform(_ctx, src) {
return src
},
markdownPlugins(_ctx) {
return [
() => {
return (_tree, file) => {
const filePath = file.data.filePath as FilePath | undefined
if (!filePath) return
const ext = getFileExtension(filePath)
if (ext !== ".base") return
const content = file.value.toString()
if (!content.trim()) return
const diagnostics: BaseExpressionDiagnostic[] = []
const filePathStr = filePath
try {
const parsed = yaml.load(content, { schema: yaml.JSON_SCHEMA }) as Record<
string,
unknown
>
if (!parsed || typeof parsed !== "object") {
diagnostics.push({
kind: "parse",
message: "Base file must contain a valid YAML object",
span: {
start: { offset: 0, line: 1, column: 1 },
end: { offset: 0, line: 1, column: 1 },
file: filePathStr,
},
context: "root",
source: content.slice(0, 100),
})
file.data.basesDiagnostics = diagnostics
return
}
const rawViews = parsed.views
if (!Array.isArray(rawViews) || rawViews.length === 0) {
diagnostics.push({
kind: "parse",
message: "Base file must have at least one view defined",
span: {
start: { offset: 0, line: 1, column: 1 },
end: { offset: 0, line: 1, column: 1 },
file: filePathStr,
},
context: "views",
source: "views: []",
})
file.data.basesDiagnostics = diagnostics
return
}
const views = parseViews(rawViews)
const filters = parsed.filters as BaseFileFilter | undefined
const properties = parsed.properties as
| Record<string, { displayName?: string }>
| undefined
const summaries = parsed.summaries as Record<string, string> | undefined
const formulas = parsed.formulas as Record<string, string> | undefined
const baseConfig: BaseFile = {
filters,
views,
properties,
summaries,
formulas,
}
const compiledFilters = compileFilterStructure(
filters as FilterStructure | undefined,
filePathStr,
diagnostics,
"filters",
)
const viewFilters: Record<string, ProgramIR> = {}
for (let i = 0; i < views.length; i++) {
const view = views[i]
if (view.filters) {
const compiled = compileFilterStructure(
view.filters as FilterStructure,
filePathStr,
diagnostics,
`views[${i}].filters`,
)
if (compiled) {
viewFilters[String(i)] = compiled
}
}
}
const compiledFormulas = compileFormulas(formulas, filePathStr, diagnostics)
const compiledSummaries = compileSummaries(summaries, filePathStr, diagnostics)
const compiledViewSummaries = compileViewSummaries(
views,
summaries,
filePathStr,
diagnostics,
)
const viewProperties = collectPropertiesFromViews(views)
for (const name of Object.keys(compiledFormulas)) {
viewProperties.add(`formula.${name}`)
}
const propertyExpressions = compilePropertyExpressions(
viewProperties,
filePathStr,
diagnostics,
)
const expressions: BasesExpressions = {
filters: compiledFilters,
viewFilters,
formulas: compiledFormulas,
summaries: compiledSummaries,
viewSummaries: compiledViewSummaries,
propertyExpressions,
}
file.data.basesConfig = baseConfig
file.data.basesExpressions = expressions
file.data.basesDiagnostics = diagnostics
const existingFrontmatter = (file.data.frontmatter ?? {}) as Record<string, unknown>
file.data.frontmatter = {
title: views[0]?.name ?? file.stem ?? "Base",
tags: ["base"],
...existingFrontmatter,
}
if (opts.emitWarnings && diagnostics.length > 0) {
for (const diag of diagnostics) {
console.warn(
`[bases] ${filePathStr}:${diag.span.start.line}:${diag.span.start.column} - ${diag.message}`,
)
}
}
} catch (err) {
const message = err instanceof Error ? err.message : String(err)
diagnostics.push({
kind: "parse",
message: `Failed to parse base file: ${message}`,
span: {
start: { offset: 0, line: 1, column: 1 },
end: { offset: 0, line: 1, column: 1 },
file: filePathStr,
},
context: "root",
source: content.slice(0, 100),
})
file.data.basesDiagnostics = diagnostics
if (opts.emitWarnings) {
console.warn(`[bases] ${filePathStr}: ${message}`)
}
}
}
},
]
},
}
}
declare module "vfile" {
interface DataMap {
basesConfig?: BaseFile
basesExpressions?: BasesExpressions
basesDiagnostics?: BaseExpressionDiagnostic[]
}
}

View File

@@ -11,4 +11,3 @@ export { SyntaxHighlighting } from "./syntax"
export { TableOfContents } from "./toc"
export { HardLineBreaks } from "./linebreaks"
export { RoamFlavoredMarkdown } from "./roam"
export { ObsidianBases } from "./bases"

View File

@@ -289,11 +289,8 @@ export const ObsidianFlavoredMarkdown: QuartzTransformerPlugin<Partial<Options>>
}
}
const isBaseFile = fp.endsWith(".base")
const basePath = isBaseFile ? fp.slice(0, -5) : fp
const url = isBaseFile
? basePath + (anchor ? `/${anchor.slice(1).replace(/\s+/g, "-")}` : "")
: fp + anchor
// internal link
const url = fp + anchor
return {
type: "link",
@@ -301,7 +298,7 @@ export const ObsidianFlavoredMarkdown: QuartzTransformerPlugin<Partial<Options>>
children: [
{
type: "text",
value: alias ?? basePath,
value: alias ?? fp,
},
],
}

View File

@@ -104,16 +104,12 @@ export function createFileParser(ctx: BuildCtx, fps: FilePath[]) {
file.data.relativePath = path.posix.relative(argv.directory, file.path) as FilePath
file.data.slug = slugifyFilePath(file.data.relativePath)
const isBaseFile = fp.endsWith(".base")
const ast: MDRoot = isBaseFile ? { type: "root", children: [] } : processor.parse(file)
const ast = processor.parse(file)
const newAst = await processor.run(ast, file)
res.push([newAst, file])
if (argv.verbose) {
console.log(
`[${isBaseFile ? "base" : "markdown"}] ${fp} -> ${file.data.slug} (${perf.timeSince()})`,
)
console.log(`[markdown] ${fp} -> ${file.data.slug} (${perf.timeSince()})`)
}
} catch (err) {
trace(`\nFailed to process markdown \`${fp}\``, err as Error)

View File

@@ -1,92 +0,0 @@
# bases compiler + runtime (quartz implementation)
status: active
last updated: 2026-01-28
this directory contains the obsidian bases compiler, interpreter, and runtime helpers used by quartz to render `.base` files. it is designed to match obsidian bases syntax and semantics with deterministic evaluation and consistent diagnostics.
You can test it out with any of the base files in my vault:
```bash
npx tsx quartz/util/base/inspect-base.ts docs/navigation.base > /tmp/ast-ir.json
jq '.expressions[] | {context, kind, source, ast}' /tmp/ast-ir.json
jq '.expressions[] | {context, kind, ir}' /tmp/ast-ir.json
```
## scope
- parse base expressions (filters, formulas, summaries, property expressions)
- compile expressions to bytecode ir
- interpret bytecode with a deterministic stack vm
- resolve file, note, formula, and property values
- render views (table, list, cards/gallery, board, calendar, map)
- surface parse and runtime diagnostics in base output
## architecture (pipeline)
1. parse `.base` yaml (plugin: `quartz/plugins/transformers/bases.ts`)
2. parse expressions into ast (`compiler/parser.ts`)
3. compile ast to ir (`compiler/ir.ts`)
4. evaluate ir per row with caches (`compiler/interpreter.ts`)
5. render views and diagnostics (`render.ts`)
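stages 3 and 4 can be sketched in miniature (a toy arithmetic subset with illustrative names, not the real `compiler/ir.ts` or `compiler/interpreter.ts`):

```typescript
// toy version of compile-to-ir + stack vm: linearize an ast to postfix
// bytecode, then evaluate it in one deterministic pass
type Expr =
  | { type: "num"; value: number }
  | { type: "bin"; op: "+" | "*"; left: Expr; right: Expr }

type Instr = { op: "const"; value: number } | { op: "add" } | { op: "mul" }

// stage 3: flatten the tree; operands are emitted before their operator
function compile(expr: Expr, out: Instr[] = []): Instr[] {
  if (expr.type === "num") {
    out.push({ op: "const", value: expr.value })
  } else {
    compile(expr.left, out)
    compile(expr.right, out)
    out.push({ op: expr.op === "+" ? "add" : "mul" })
  }
  return out
}

// stage 4: stack machine, linear in the number of instructions
function run(program: Instr[]): number {
  const stack: number[] = []
  for (const instr of program) {
    if (instr.op === "const") stack.push(instr.value)
    else {
      const b = stack.pop()!
      const a = stack.pop()!
      stack.push(instr.op === "add" ? a + b : a * b)
    }
  }
  return stack.pop()!
}

// ast for 1 + 2 * 3
const ast: Expr = {
  type: "bin",
  op: "+",
  left: { type: "num", value: 1 },
  right: {
    type: "bin",
    op: "*",
    left: { type: "num", value: 2 },
    right: { type: "num", value: 3 },
  },
}
```

the real ir adds spans, jumps for short-circuiting, and method/call opcodes, but the shape is the same.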
## modules
- `compiler/lexer.ts`: tokenizer with span tracking and regex support
- `compiler/parser.ts`: pratt parser for expression grammar and error recovery
- `compiler/ir.ts`: bytecode instruction set + compiler
- `compiler/interpreter.ts`: stack vm, value model, coercions, methods, functions
- `compiler/diagnostics.ts`: diagnostics types and helpers
- `compiler/schema.ts`: summary config schema and builtins
- `compiler/properties.ts`: property expression builder for columns and config keys
- `render.ts`: view rendering and diagnostics output
- `query.ts`: summaries and view summary helpers
- `types.ts`: base config types and yaml parsing helpers
## value model (runtime)
runtime values are tagged unions with explicit kinds:
- null, boolean, number, string
- date, duration
- list, object
- file, link
- regex, html, icon, image
coercions are permissive to match obsidian behavior. comparisons prefer type-aware equality (links resolve to files when possible, dates compare by time, etc.), with fallbacks when resolution fails.
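a minimal sketch of this equality policy (illustrative only; the real `interpreter.ts` covers many more kinds and resolves links to files first):

```typescript
// tagged-union values with type-aware equality and a permissive fallback
type Value =
  | { kind: "null"; value: null }
  | { kind: "number"; value: number }
  | { kind: "string"; value: string }
  | { kind: "date"; value: Date }

function valueEquals(a: Value, b: Value): boolean {
  // dates compare by timestamp, not by object identity
  if (a.kind === "date" && b.kind === "date") {
    return a.value.getTime() === b.value.getTime()
  }
  // permissive fallback across kinds: compare string renderings
  if (a.kind !== b.kind) return String(a.value) === String(b.value)
  return a.value === b.value
}
```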
## expression features (spec parity)
- operators: `==`, `!=`, `>`, `<`, `>=`, `<=`, `&&`, `||`, `!`, `+`, `-`, `*`, `/`, `%`
- member and index access
- function calls and method calls
- list literals and regex literals
- `this` binding with embed-aware scoping
- list helpers (`filter`, `map`, `reduce`) using implicit locals `value`, `index`, `acc`
- summary context helpers: `values` (column values) and `rows` (row files)
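the implicit-locals mechanism for list helpers can be sketched like this (illustrative shape; the real interpreter binds `value`, `index`, and `acc` inside the vm rather than via closures):

```typescript
// each element is evaluated with `value` and `index` bound in a child scope
// layered over the parent scope, so outer names stay visible
type Scope = Record<string, unknown>

function evalFilter(
  list: unknown[],
  body: (scope: Scope) => unknown,
  parent: Scope = {},
): unknown[] {
  return list.filter((value, index) => Boolean(body({ ...parent, value, index })))
}
```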
## diagnostics
- parser diagnostics are collected with spans at compile time
- runtime diagnostics are collected during evaluation and deduped per context
- base views render diagnostics above the view output
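per-context dedup amounts to keying diagnostics on (context, message) and keeping the first occurrence; a minimal sketch with assumed field names:

```typescript
// drop repeated diagnostics within the same context, preserving order
type Diag = { context: string; message: string }

function dedupe(diags: Diag[]): Diag[] {
  const seen = new Set<string>()
  return diags.filter((d) => {
    const key = `${d.context}\u0000${d.message}`
    if (seen.has(key)) return false
    seen.add(key)
    return true
  })
}
```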
## this scoping
- main base file: `this` resolves to the base file
- embedded base: `this` resolves to the embedding file
- row evaluation: `file` resolves to the row file
## performance decisions
- bytecode ir keeps evaluation linear and stable
- per-build backlink index avoids n^2 scans
- property cache memoizes property expressions per file
- formula cache memoizes formula evaluation per file
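the memoization pattern behind the property and formula caches can be sketched as a map keyed per (file, expression); names here are illustrative, not the actual cache api:

```typescript
// memoize an evaluator per (slug, formula) pair and count cache hits
function makeFormulaCache<T>(evaluate: (slug: string, formula: string) => T) {
  const cache = new Map<string, T>()
  let hits = 0
  return {
    get(slug: string, formula: string): T {
      const key = `${slug}\u0000${formula}`
      if (cache.has(key)) {
        hits += 1
        return cache.get(key)!
      }
      const result = evaluate(slug, formula)
      cache.set(key, result)
      return result
    },
    get hits() {
      return hits
    },
  }
}
```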
## view rendering
- table, list, cards/gallery, board, calendar, map
- map rendering expects coordinates `[lat, lon]` and map config fields
- view filters combine with base filters via logical and

View File

@@ -1,76 +0,0 @@
export type Position = { offset: number; line: number; column: number }
export type Span = { start: Position; end: Position; file?: string }
export type Program = { type: "Program"; body: Expr | null; span: Span }
export type Expr =
| Literal
| Identifier
| UnaryExpr
| BinaryExpr
| LogicalExpr
| CallExpr
| MemberExpr
| IndexExpr
| ListExpr
| ErrorExpr
export type LiteralKind = "number" | "string" | "boolean" | "null" | "date" | "duration" | "regex"
export type NumberLiteral = { type: "Literal"; kind: "number"; value: number; span: Span }
export type StringLiteral = { type: "Literal"; kind: "string"; value: string; span: Span }
export type BooleanLiteral = { type: "Literal"; kind: "boolean"; value: boolean; span: Span }
export type NullLiteral = { type: "Literal"; kind: "null"; value: null; span: Span }
export type DateLiteral = { type: "Literal"; kind: "date"; value: string; span: Span }
export type DurationLiteral = { type: "Literal"; kind: "duration"; value: string; span: Span }
export type RegexLiteral = {
type: "Literal"
kind: "regex"
value: string
flags: string
span: Span
}
export type Literal =
| NumberLiteral
| StringLiteral
| BooleanLiteral
| NullLiteral
| DateLiteral
| DurationLiteral
| RegexLiteral
export type Identifier = { type: "Identifier"; name: string; span: Span }
export type UnaryExpr = { type: "UnaryExpr"; operator: "!" | "-"; argument: Expr; span: Span }
export type BinaryExpr = {
type: "BinaryExpr"
operator: "+" | "-" | "*" | "/" | "%" | "==" | "!=" | ">" | ">=" | "<" | "<="
left: Expr
right: Expr
span: Span
}
export type LogicalExpr = {
type: "LogicalExpr"
operator: "&&" | "||"
left: Expr
right: Expr
span: Span
}
export type CallExpr = { type: "CallExpr"; callee: Expr; args: Expr[]; span: Span }
export type MemberExpr = { type: "MemberExpr"; object: Expr; property: string; span: Span }
export type IndexExpr = { type: "IndexExpr"; object: Expr; index: Expr; span: Span }
export type ListExpr = { type: "ListExpr"; elements: Expr[]; span: Span }
export type ErrorExpr = { type: "ErrorExpr"; message: string; span: Span }
export function spanFrom(start: Span, end: Span): Span {
return { start: start.start, end: end.end, file: start.file || end.file }
}

View File

@@ -1,9 +0,0 @@
import { Span } from "./ast"
export type BaseExpressionDiagnostic = {
kind: "lex" | "parse" | "runtime"
message: string
span: Span
context: string
source: string
}

View File

@@ -1,3 +0,0 @@
import { Span } from "./ast"
export type Diagnostic = { kind: "lex" | "parse"; message: string; span: Span }

View File

@@ -1,10 +0,0 @@
import { ProgramIR } from "./ir"
export type BasesExpressions = {
filters?: ProgramIR
viewFilters: Record<string, ProgramIR>
formulas: Record<string, ProgramIR>
summaries: Record<string, ProgramIR>
viewSummaries: Record<string, Record<string, ProgramIR>>
propertyExpressions: Record<string, ProgramIR>
}

View File

@@ -1,44 +0,0 @@
export { lex } from "./lexer"
export { parseExpressionSource } from "./parser"
export type { ParseResult } from "./parser"
export type { Diagnostic } from "./errors"
export type { Program, Expr, Span, Position } from "./ast"
export type { BaseExpressionDiagnostic } from "./diagnostics"
export type { BasesExpressions } from "./expressions"
export type { Instruction, ProgramIR } from "./ir"
export { compileExpression } from "./ir"
export { buildPropertyExpressionSource } from "./properties"
export type {
SummaryDefinition,
ViewSummaryConfig,
PropertyConfig,
BuiltinSummaryType,
} from "./schema"
export { BUILTIN_SUMMARY_TYPES } from "./schema"
export {
evaluateExpression,
evaluateFilterExpression,
evaluateSummaryExpression,
valueToUnknown,
} from "./interpreter"
export type {
EvalContext,
Value,
NullValue,
BooleanValue,
NumberValue,
StringValue,
DateValue,
DurationValue,
ListValue,
ObjectValue,
FileValue,
LinkValue,
RegexValue,
HtmlValue,
IconValue,
ImageValue,
ValueKind,
ValueOf,
} from "./interpreter"
export { isValueKind } from "./interpreter"

View File

@@ -1,73 +0,0 @@
import assert from "node:assert"
import test from "node:test"
import { FilePath, FullSlug, SimpleSlug } from "../../path"
type ContentLayout = "default" | "article" | "page"
import { evaluateExpression, valueToUnknown, EvalContext } from "./interpreter"
import { compileExpression } from "./ir"
import { parseExpressionSource } from "./parser"
const parseExpr = (source: string) => {
const result = parseExpressionSource(source, "test")
if (!result.program.body) {
throw new Error(`expected expression for ${source}`)
}
return compileExpression(result.program.body)
}
const makeCtx = (): EvalContext => {
const fileA = {
slug: "a" as FullSlug,
filePath: "a.md" as FilePath,
frontmatter: { title: "A", pageLayout: "default" as ContentLayout },
links: [] as SimpleSlug[],
}
const fileB = {
slug: "b" as FullSlug,
filePath: "b.md" as FilePath,
frontmatter: { title: "B", pageLayout: "default" as ContentLayout },
links: ["a"] as SimpleSlug[],
}
return { file: fileA, allFiles: [fileA, fileB] }
}
test("link equality resolves to file targets", () => {
const expr = parseExpr('link("a") == file("a")')
const value = valueToUnknown(evaluateExpression(expr, makeCtx()))
assert.strictEqual(value, true)
})
test("link equality matches raw string targets", () => {
const expr = parseExpr('link("a") == "a"')
const value = valueToUnknown(evaluateExpression(expr, makeCtx()))
assert.strictEqual(value, true)
})
test("date arithmetic handles month additions", () => {
const expr = parseExpr('date("2025-01-01") + "1M"')
const value = valueToUnknown(evaluateExpression(expr, makeCtx()))
assert.ok(value instanceof Date)
assert.strictEqual(value.toISOString().split("T")[0], "2025-02-01")
})
test("date subtraction returns duration in ms", () => {
const expr = parseExpr('date("2025-01-02") - date("2025-01-01")')
const value = valueToUnknown(evaluateExpression(expr, makeCtx()))
assert.strictEqual(value, 86400000)
})
test("list summary helpers compute statistics", () => {
const meanExpr = parseExpr("([1, 2, 3]).mean()")
const medianExpr = parseExpr("([1, 2, 3]).median()")
const stddevExpr = parseExpr("([1, 2, 3]).stddev()")
const sumExpr = parseExpr("([1, 2, 3]).sum()")
const ctx = makeCtx()
assert.strictEqual(valueToUnknown(evaluateExpression(meanExpr, ctx)), 2)
assert.strictEqual(valueToUnknown(evaluateExpression(medianExpr, ctx)), 2)
assert.strictEqual(valueToUnknown(evaluateExpression(sumExpr, ctx)), 6)
const stddev = valueToUnknown(evaluateExpression(stddevExpr, ctx))
assert.strictEqual(typeof stddev, "number")
if (typeof stddev === "number") {
assert.ok(Math.abs(stddev - Math.sqrt(2 / 3)) < 1e-6)
}
})

File diff suppressed because it is too large

View File

@@ -1,164 +0,0 @@
import { BinaryExpr, Expr, Literal, Span, UnaryExpr } from "./ast"
export type JumpInstruction = {
op: "jump" | "jump_if_false" | "jump_if_true"
target: number
span: Span
}
export type Instruction =
| { op: "const"; literal: Literal; span: Span }
| { op: "ident"; name: string; span: Span }
| { op: "load_formula"; name: string; span: Span }
| { op: "load_formula_index"; span: Span }
| { op: "member"; property: string; span: Span }
| { op: "index"; span: Span }
| { op: "list"; count: number; span: Span }
| { op: "unary"; operator: UnaryExpr["operator"]; span: Span }
| { op: "binary"; operator: BinaryExpr["operator"]; span: Span }
| { op: "to_bool"; span: Span }
| { op: "call_global"; name: string; argc: number; span: Span }
| { op: "call_method"; name: string; argc: number; span: Span }
| { op: "call_dynamic"; span: Span }
| { op: "filter"; program: ProgramIR | null; span: Span }
| { op: "map"; program: ProgramIR | null; span: Span }
| { op: "reduce"; program: ProgramIR | null; initial: ProgramIR | null; span: Span }
| JumpInstruction
export type ProgramIR = { instructions: Instruction[]; span: Span }
const compileExpr = (expr: Expr, out: Instruction[]) => {
switch (expr.type) {
case "Literal":
out.push({ op: "const", literal: expr, span: expr.span })
return
case "Identifier":
out.push({ op: "ident", name: expr.name, span: expr.span })
return
case "UnaryExpr":
compileExpr(expr.argument, out)
out.push({ op: "unary", operator: expr.operator, span: expr.span })
return
case "BinaryExpr":
compileExpr(expr.left, out)
compileExpr(expr.right, out)
out.push({ op: "binary", operator: expr.operator, span: expr.span })
return
case "LogicalExpr": {
if (expr.operator === "&&") {
compileExpr(expr.left, out)
const jumpFalse: JumpInstruction = { op: "jump_if_false", target: -1, span: expr.span }
out.push(jumpFalse)
compileExpr(expr.right, out)
out.push({ op: "to_bool", span: expr.span })
const jumpEnd: JumpInstruction = { op: "jump", target: -1, span: expr.span }
out.push(jumpEnd)
const falseTarget = out.length
jumpFalse.target = falseTarget
out.push({
op: "const",
literal: { type: "Literal", kind: "boolean", value: false, span: expr.span },
span: expr.span,
})
jumpEnd.target = out.length
return
}
compileExpr(expr.left, out)
const jumpTrue: JumpInstruction = { op: "jump_if_true", target: -1, span: expr.span }
out.push(jumpTrue)
compileExpr(expr.right, out)
out.push({ op: "to_bool", span: expr.span })
const jumpEnd: JumpInstruction = { op: "jump", target: -1, span: expr.span }
out.push(jumpEnd)
const trueTarget = out.length
jumpTrue.target = trueTarget
out.push({
op: "const",
literal: { type: "Literal", kind: "boolean", value: true, span: expr.span },
span: expr.span,
})
jumpEnd.target = out.length
return
}
case "MemberExpr":
if (expr.object.type === "Identifier" && expr.object.name === "formula") {
out.push({ op: "load_formula", name: expr.property, span: expr.span })
return
}
compileExpr(expr.object, out)
out.push({ op: "member", property: expr.property, span: expr.span })
return
case "IndexExpr":
if (expr.object.type === "Identifier" && expr.object.name === "formula") {
compileExpr(expr.index, out)
out.push({ op: "load_formula_index", span: expr.span })
return
}
compileExpr(expr.object, out)
compileExpr(expr.index, out)
out.push({ op: "index", span: expr.span })
return
case "ListExpr":
for (const element of expr.elements) {
compileExpr(element, out)
}
out.push({ op: "list", count: expr.elements.length, span: expr.span })
return
case "CallExpr": {
if (expr.callee.type === "Identifier") {
for (const arg of expr.args) {
compileExpr(arg, out)
}
out.push({
op: "call_global",
name: expr.callee.name,
argc: expr.args.length,
span: expr.span,
})
return
}
if (expr.callee.type === "MemberExpr") {
const method = expr.callee.property
if (method === "filter" || method === "map" || method === "reduce") {
compileExpr(expr.callee.object, out)
const exprArg = expr.args[0]
const program = exprArg ? compileExpression(exprArg) : null
if (method === "filter") {
out.push({ op: "filter", program, span: expr.span })
return
}
if (method === "map") {
out.push({ op: "map", program, span: expr.span })
return
}
const initialArg = expr.args[1]
const initial = initialArg ? compileExpression(initialArg) : null
out.push({ op: "reduce", program, initial, span: expr.span })
return
}
compileExpr(expr.callee.object, out)
for (const arg of expr.args) {
compileExpr(arg, out)
}
out.push({ op: "call_method", name: method, argc: expr.args.length, span: expr.span })
return
}
compileExpr(expr.callee, out)
out.push({ op: "call_dynamic", span: expr.span })
return
}
case "ErrorExpr":
out.push({
op: "const",
literal: { type: "Literal", kind: "null", value: null, span: expr.span },
span: expr.span,
})
return
}
}
export const compileExpression = (expr: Expr): ProgramIR => {
const instructions: Instruction[] = []
compileExpr(expr, instructions)
return { instructions, span: expr.span }
}

View File

@@ -1,53 +0,0 @@
import assert from "node:assert"
import test from "node:test"
import { lex } from "./lexer"
test("lexes bracket access with hyphenated keys", () => {
const result = lex('note["my-field"]')
const types = result.tokens.map((token) => token.type)
assert.deepStrictEqual(types, ["identifier", "punctuation", "string", "punctuation", "eof"])
const value = result.tokens[2]
if (value.type !== "string") {
throw new Error("expected string token")
}
assert.strictEqual(value.value, "my-field")
})
test("lexes bracket access with escaped quotes", () => {
const result = lex('note["my\\\"field"]')
const value = result.tokens.find((token) => token.type === "string")
if (!value || value.type !== "string") {
throw new Error("expected string token")
}
assert.strictEqual(value.value, 'my"field')
})
test("lexes regex literals with flags", () => {
const result = lex('name.replace(/:/g, "-")')
const regexToken = result.tokens.find((token) => token.type === "regex")
if (!regexToken || regexToken.type !== "regex") {
throw new Error("expected regex token")
}
assert.strictEqual(regexToken.pattern, ":")
assert.strictEqual(regexToken.flags, "g")
})
test("lexes regex literals with escaped slashes", () => {
const result = lex("path.matches(/\\//)")
const regexToken = result.tokens.find((token) => token.type === "regex")
if (!regexToken || regexToken.type !== "regex") {
throw new Error("expected regex token")
}
assert.strictEqual(regexToken.pattern, "\\/")
assert.strictEqual(regexToken.flags, "")
})
test("lexes division as operator, not regex", () => {
const result = lex("a / b")
const operatorToken = result.tokens.find(
(token) => token.type === "operator" && token.value === "/",
)
assert.ok(operatorToken)
const regexToken = result.tokens.find((token) => token.type === "regex")
assert.strictEqual(regexToken, undefined)
})

View File

@@ -1,300 +0,0 @@
import { Position, Span } from "./ast"
import { Diagnostic } from "./errors"
import {
Operator,
Punctuation,
Token,
StringToken,
RegexToken,
NumberToken,
BooleanToken,
NullToken,
ThisToken,
IdentifierToken,
OperatorToken,
PunctuationToken,
EofToken,
} from "./tokens"
type LexResult = { tokens: Token[]; diagnostics: Diagnostic[] }
const operatorTokens: Operator[] = [
"==",
"!=",
">=",
"<=",
"&&",
"||",
"+",
"-",
"*",
"/",
"%",
"!",
">",
"<",
]
const punctuationTokens: Punctuation[] = [".", ",", "(", ")", "[", "]"]
const isOperator = (value: string): value is Operator =>
operatorTokens.some((token) => token === value)
const isPunctuation = (value: string): value is Punctuation =>
punctuationTokens.some((token) => token === value)
export function lex(input: string, file?: string): LexResult {
const tokens: Token[] = []
const diagnostics: Diagnostic[] = []
let index = 0
let line = 1
let column = 1
let canStartRegex = true
const makePosition = (offset: number, lineValue: number, columnValue: number): Position => ({
offset,
line: lineValue,
column: columnValue,
})
const currentPosition = (): Position => makePosition(index, line, column)
const makeSpan = (start: Position, end: Position): Span => ({ start, end, file })
const advance = (): string => {
const ch = input[index]
index += 1
if (ch === "\n") {
line += 1
column = 1
} else {
column += 1
}
return ch
}
const peek = (offset = 0): string => input[index + offset] ?? ""
const addDiagnostic = (message: string, span: Span) => {
diagnostics.push({ kind: "lex", message, span })
}
const updateRegexState = (token: Token | null) => {
if (!token) {
canStartRegex = true
return
}
if (token.type === "operator") {
canStartRegex = true
return
}
if (token.type === "punctuation") {
canStartRegex = token.value === "(" || token.value === "[" || token.value === ","
return
}
canStartRegex = false
}
const isWhitespace = (ch: string) => ch === " " || ch === "\t" || ch === "\n" || ch === "\r"
const isDigit = (ch: string) => ch >= "0" && ch <= "9"
const isIdentStart = (ch: string) =>
(ch >= "a" && ch <= "z") || (ch >= "A" && ch <= "Z") || ch === "_"
const isIdentContinue = (ch: string) => isIdentStart(ch) || isDigit(ch)
while (index < input.length) {
const ch = peek()
if (isWhitespace(ch)) {
advance()
continue
}
const start = currentPosition()
if (ch === "=" && peek(1) !== "=") {
let offset = 1
while (isWhitespace(peek(offset))) {
offset += 1
}
if (peek(offset) === ">") {
advance()
for (let step = 1; step < offset; step += 1) {
advance()
}
if (peek() === ">") {
advance()
}
const end = currentPosition()
addDiagnostic(
"arrow functions are not supported, use list.filter(expression)",
makeSpan(start, end),
)
continue
}
}
if (ch === '"' || ch === "'") {
const quote = advance()
let value = ""
let closed = false
while (index < input.length) {
const curr = advance()
if (curr === quote) {
closed = true
break
}
if (curr === "\\") {
const next = advance()
if (next === "n") value += "\n"
else if (next === "t") value += "\t"
else if (next === "r") value += "\r"
else if (next === "\\" || next === "'" || next === '"') value += next
else value += next
} else {
value += curr
}
}
const end = currentPosition()
const span = makeSpan(start, end)
if (!closed) addDiagnostic("unterminated string literal", span)
const token: StringToken = { type: "string", value, span }
tokens.push(token)
updateRegexState(token)
continue
}
if (ch === "/" && canStartRegex) {
const next = peek(1)
if (next !== "/" && next !== "") {
advance()
let pattern = ""
let closed = false
let inClass = false
while (index < input.length) {
const curr = advance()
if (curr === "\\" && index < input.length) {
const escaped = advance()
pattern += `\\${escaped}`
continue
}
if (curr === "[" && !inClass) inClass = true
if (curr === "]" && inClass) inClass = false
if (curr === "/" && !inClass) {
closed = true
break
}
pattern += curr
}
let flags = ""
while (index < input.length) {
const flag = peek()
if (!/^[gimsuy]$/.test(flag)) break
flags += advance()
}
const end = currentPosition()
const span = makeSpan(start, end)
if (!closed) addDiagnostic("unterminated regex literal", span)
const token: RegexToken = { type: "regex", pattern, flags, span }
tokens.push(token)
updateRegexState(token)
continue
}
}
if (isDigit(ch)) {
let num = ""
while (index < input.length && isDigit(peek())) {
num += advance()
}
if (peek() === "." && isDigit(peek(1))) {
num += advance()
while (index < input.length && isDigit(peek())) {
num += advance()
}
}
const end = currentPosition()
const span = makeSpan(start, end)
const token: NumberToken = { type: "number", value: Number(num), span }
tokens.push(token)
updateRegexState(token)
continue
}
if (isIdentStart(ch)) {
let ident = ""
while (index < input.length && isIdentContinue(peek())) {
ident += advance()
}
const end = currentPosition()
const span = makeSpan(start, end)
if (ident === "true" || ident === "false") {
const token: BooleanToken = { type: "boolean", value: ident === "true", span }
tokens.push(token)
updateRegexState(token)
continue
}
if (ident === "null") {
const token: NullToken = { type: "null", span }
tokens.push(token)
updateRegexState(token)
continue
}
if (ident === "this") {
const token: ThisToken = { type: "this", span }
tokens.push(token)
updateRegexState(token)
continue
}
const token: IdentifierToken = { type: "identifier", value: ident, span }
tokens.push(token)
updateRegexState(token)
continue
}
const twoChar = ch + peek(1)
if (isOperator(twoChar)) {
advance()
advance()
const end = currentPosition()
const span = makeSpan(start, end)
const token: OperatorToken = { type: "operator", value: twoChar, span }
tokens.push(token)
updateRegexState(token)
continue
}
if (isOperator(ch)) {
advance()
const end = currentPosition()
const span = makeSpan(start, end)
const token: OperatorToken = { type: "operator", value: ch, span }
tokens.push(token)
updateRegexState(token)
continue
}
if (isPunctuation(ch)) {
advance()
const end = currentPosition()
const span = makeSpan(start, end)
const token: PunctuationToken = { type: "punctuation", value: ch, span }
tokens.push(token)
updateRegexState(token)
continue
}
advance()
const end = currentPosition()
addDiagnostic(`unexpected character: ${ch}`, makeSpan(start, end))
}
const eofPos = currentPosition()
const eofSpan = makeSpan(eofPos, eofPos)
const eofToken: EofToken = { type: "eof", span: eofSpan }
tokens.push(eofToken)
updateRegexState(eofToken)
return { tokens, diagnostics }
}

View File

@@ -1,261 +0,0 @@
import assert from "node:assert"
import test from "node:test"
import { parseExpressionSource } from "./parser"
const isRecord = (value: unknown): value is Record<string, unknown> =>
typeof value === "object" && value !== null
const strip = (node: unknown): unknown => {
if (!isRecord(node)) return node
const type = node.type
if (type === "Identifier") {
return { type, name: node.name }
}
if (type === "Literal") {
const kind = node.kind
const value = node.value
const flags = node.flags
return flags !== undefined ? { type, kind, value, flags } : { type, kind, value }
}
if (type === "UnaryExpr") {
return { type, operator: node.operator, argument: strip(node.argument) }
}
if (type === "BinaryExpr" || type === "LogicalExpr") {
return { type, operator: node.operator, left: strip(node.left), right: strip(node.right) }
}
if (type === "CallExpr") {
const args = Array.isArray(node.args) ? node.args.map(strip) : []
return { type, callee: strip(node.callee), args }
}
if (type === "MemberExpr") {
return { type, object: strip(node.object), property: node.property }
}
if (type === "IndexExpr") {
return { type, object: strip(node.object), index: strip(node.index) }
}
if (type === "ListExpr") {
const elements = Array.isArray(node.elements) ? node.elements.map(strip) : []
return { type, elements }
}
if (type === "ErrorExpr") {
return { type, message: node.message }
}
return node
}
test("ebnf to ast mapping snapshots", () => {
const cases: Array<{ source: string; expected: unknown }> = [
{
source: 'status == "done"',
expected: {
type: "BinaryExpr",
operator: "==",
left: { type: "Identifier", name: "status" },
right: { type: "Literal", kind: "string", value: "done" },
},
},
{
source: "!done",
expected: {
type: "UnaryExpr",
operator: "!",
argument: { type: "Identifier", name: "done" },
},
},
{
source: "file.ctime",
expected: {
type: "MemberExpr",
object: { type: "Identifier", name: "file" },
property: "ctime",
},
},
{
source: 'note["my-field"]',
expected: {
type: "IndexExpr",
object: { type: "Identifier", name: "note" },
index: { type: "Literal", kind: "string", value: "my-field" },
},
},
{
source: "date(due) < today()",
expected: {
type: "BinaryExpr",
operator: "<",
left: {
type: "CallExpr",
callee: { type: "Identifier", name: "date" },
args: [{ type: "Identifier", name: "due" }],
},
right: { type: "CallExpr", callee: { type: "Identifier", name: "today" }, args: [] },
},
},
{
source: "now() - file.ctime",
expected: {
type: "BinaryExpr",
operator: "-",
left: { type: "CallExpr", callee: { type: "Identifier", name: "now" }, args: [] },
right: {
type: "MemberExpr",
object: { type: "Identifier", name: "file" },
property: "ctime",
},
},
},
{
source: "(pages * 2).round(0)",
expected: {
type: "CallExpr",
callee: {
type: "MemberExpr",
object: {
type: "BinaryExpr",
operator: "*",
left: { type: "Identifier", name: "pages" },
right: { type: "Literal", kind: "number", value: 2 },
},
property: "round",
},
args: [{ type: "Literal", kind: "number", value: 0 }],
},
},
{
source: 'tags.containsAny("a","b")',
expected: {
type: "CallExpr",
callee: {
type: "MemberExpr",
object: { type: "Identifier", name: "tags" },
property: "containsAny",
},
args: [
{ type: "Literal", kind: "string", value: "a" },
{ type: "Literal", kind: "string", value: "b" },
],
},
},
{
source: "list(links).filter(value.isTruthy())",
expected: {
type: "CallExpr",
callee: {
type: "MemberExpr",
object: {
type: "CallExpr",
callee: { type: "Identifier", name: "list" },
args: [{ type: "Identifier", name: "links" }],
},
property: "filter",
},
args: [
{
type: "CallExpr",
callee: {
type: "MemberExpr",
object: { type: "Identifier", name: "value" },
property: "isTruthy",
},
args: [],
},
],
},
},
{
source: '["a", "b", "c"].length',
expected: {
type: "MemberExpr",
object: {
type: "ListExpr",
elements: [
{ type: "Literal", kind: "string", value: "a" },
{ type: "Literal", kind: "string", value: "b" },
{ type: "Literal", kind: "string", value: "c" },
],
},
property: "length",
},
},
{
source: "this.file.name",
expected: {
type: "MemberExpr",
object: {
type: "MemberExpr",
object: { type: "Identifier", name: "this" },
property: "file",
},
property: "name",
},
},
{
source: "a || b && c",
expected: {
type: "LogicalExpr",
operator: "||",
left: { type: "Identifier", name: "a" },
right: {
type: "LogicalExpr",
operator: "&&",
left: { type: "Identifier", name: "b" },
right: { type: "Identifier", name: "c" },
},
},
},
{
source: "values[0]",
expected: {
type: "IndexExpr",
object: { type: "Identifier", name: "values" },
index: { type: "Literal", kind: "number", value: 0 },
},
},
]
for (const entry of cases) {
const result = parseExpressionSource(entry.source)
assert.strictEqual(result.diagnostics.length, 0)
assert.deepStrictEqual(strip(result.program.body), entry.expected)
}
})
test("syntax doc samples parse", () => {
const samples = [
'note["price"]',
"file.size > 10",
"file.hasLink(this.file)",
'date("2024-12-01") + "1M" + "4h" + "3m"',
"now() - file.ctime",
"property[0]",
'link("filename", icon("plus"))',
'file.mtime > now() - "1 week"',
'/abc/.matches("abcde")',
'name.replace(/:/g, "-")',
'values.filter(value.isType("number")).reduce(if(acc == null || value > acc, value, acc), null)',
]
for (const source of samples) {
const result = parseExpressionSource(source)
assert.strictEqual(result.diagnostics.length, 0)
assert.ok(result.program.body)
}
})
test("string escapes are decoded", () => {
const result = parseExpressionSource('"a\\n\\"b"')
assert.strictEqual(result.diagnostics.length, 0)
const literal = strip(result.program.body)
if (!isRecord(literal)) {
throw new Error("expected literal record")
}
assert.strictEqual(literal.type, "Literal")
assert.strictEqual(literal.kind, "string")
assert.strictEqual(literal.value, 'a\n"b')
})
test("parser reports errors and recovers", () => {
const result = parseExpressionSource("status ==")
assert.ok(result.diagnostics.length > 0)
assert.ok(result.program.body)
})

View File

@@ -1,370 +0,0 @@
import {
BinaryExpr,
CallExpr,
ErrorExpr,
Expr,
Identifier,
IndexExpr,
ListExpr,
Literal,
LogicalExpr,
MemberExpr,
Program,
UnaryExpr,
spanFrom,
} from "./ast"
import { Diagnostic } from "./errors"
import { lex } from "./lexer"
import { Operator, Token } from "./tokens"
export type ParseResult = { program: Program; tokens: Token[]; diagnostics: Diagnostic[] }
type InfixInfo = { lbp: number; rbp: number; kind: "binary" | "logical" }
const infixBindingPowers: Record<string, InfixInfo> = {
"||": { lbp: 1, rbp: 2, kind: "logical" },
"&&": { lbp: 3, rbp: 4, kind: "logical" },
"==": { lbp: 5, rbp: 6, kind: "binary" },
"!=": { lbp: 5, rbp: 6, kind: "binary" },
">": { lbp: 7, rbp: 8, kind: "binary" },
">=": { lbp: 7, rbp: 8, kind: "binary" },
"<": { lbp: 7, rbp: 8, kind: "binary" },
"<=": { lbp: 7, rbp: 8, kind: "binary" },
"+": { lbp: 9, rbp: 10, kind: "binary" },
"-": { lbp: 9, rbp: 10, kind: "binary" },
"*": { lbp: 11, rbp: 12, kind: "binary" },
"/": { lbp: 11, rbp: 12, kind: "binary" },
"%": { lbp: 11, rbp: 12, kind: "binary" },
}
const isLogicalOperator = (value: Operator): value is LogicalExpr["operator"] =>
value === "&&" || value === "||"
const isBinaryOperator = (value: Operator): value is BinaryExpr["operator"] =>
value === "+" ||
value === "-" ||
value === "*" ||
value === "/" ||
value === "%" ||
value === "==" ||
value === "!=" ||
value === ">" ||
value === ">=" ||
value === "<" ||
value === "<="
export function parseExpressionSource(source: string, file?: string): ParseResult {
const { tokens, diagnostics } = lex(source, file)
const parser = new Parser(tokens, diagnostics)
const program = parser.parseProgram()
return { program, tokens, diagnostics }
}
class Parser {
private tokens: Token[]
private index: number
private diagnostics: Diagnostic[]
constructor(tokens: Token[], diagnostics: Diagnostic[]) {
this.tokens = tokens
this.index = 0
this.diagnostics = diagnostics
}
parseProgram(): Program {
const start = this.tokens[0]?.span ?? this.tokens[this.tokens.length - 1].span
const body = this.peek().type === "eof" ? null : this.parseExpression(0)
const end = this.tokens[this.tokens.length - 1]?.span ?? start
return { type: "Program", body, span: spanFrom(start, end) }
}
private parseExpression(minBp: number): Expr {
let left = this.parsePrefix()
left = this.parsePostfix(left)
while (true) {
const token = this.peek()
if (token.type !== "operator") break
const info = infixBindingPowers[token.value]
if (!info || info.lbp < minBp) break
this.advance()
const right = this.parseExpression(info.rbp)
const span = spanFrom(left.span, right.span)
if (info.kind === "logical" && isLogicalOperator(token.value)) {
left = { type: "LogicalExpr", operator: token.value, left, right, span }
} else if (info.kind === "binary" && isBinaryOperator(token.value)) {
left = { type: "BinaryExpr", operator: token.value, left, right, span }
} else {
this.error("unexpected operator", token.span)
}
}
return left
}
private parsePrefix(): Expr {
const token = this.peek()
if (token.type === "operator" && (token.value === "!" || token.value === "-")) {
this.advance()
const argument = this.parseExpression(13)
const span = spanFrom(token.span, argument.span)
const node: UnaryExpr = { type: "UnaryExpr", operator: token.value, argument, span }
return node
}
return this.parsePrimary()
}
private parsePostfix(expr: Expr): Expr {
let current = expr
while (true) {
const token = this.peek()
if (token.type === "punctuation" && token.value === ".") {
this.advance()
const propToken = this.peek()
if (propToken.type !== "identifier") {
this.error("expected identifier after '.'", propToken.span)
return current
}
this.advance()
const span = spanFrom(current.span, propToken.span)
const node: MemberExpr = {
type: "MemberExpr",
object: current,
property: propToken.value,
span,
}
current = node
continue
}
if (token.type === "punctuation" && token.value === "[") {
this.advance()
const indexExpr = this.parseExpression(0)
const endToken = this.peek()
if (!(endToken.type === "punctuation" && endToken.value === "]")) {
this.error("expected ']'", endToken.span)
this.syncTo("]")
} else {
this.advance()
}
const span = spanFrom(current.span, endToken.span)
const node: IndexExpr = { type: "IndexExpr", object: current, index: indexExpr, span }
current = node
continue
}
if (token.type === "punctuation" && token.value === "(") {
this.advance()
const args: Expr[] = []
while (this.peek().type !== "eof") {
const next = this.peek()
if (next.type === "punctuation" && next.value === ")") {
this.advance()
break
}
const arg = this.parseExpression(0)
args.push(arg)
const sep = this.peek()
if (sep.type === "punctuation" && sep.value === ",") {
this.advance()
const maybeClose = this.peek()
if (maybeClose.type === "punctuation" && maybeClose.value === ")") {
this.advance()
break
}
continue
}
if (sep.type === "punctuation" && sep.value === ")") {
this.advance()
break
}
this.error("expected ',' or ')'", sep.span)
this.syncTo(")")
const maybeClose = this.peek()
if (maybeClose.type === "punctuation" && maybeClose.value === ")") {
this.advance()
}
break
}
const endToken = this.previous()
const span = spanFrom(current.span, endToken.span)
const node: CallExpr = { type: "CallExpr", callee: current, args, span }
current = node
continue
}
break
}
return current
}
private parsePrimary(): Expr {
const token = this.peek()
if (token.type === "number") {
this.advance()
const node: Literal = {
type: "Literal",
kind: "number",
value: token.value,
span: token.span,
}
return node
}
if (token.type === "string") {
this.advance()
const node: Literal = {
type: "Literal",
kind: "string",
value: token.value,
span: token.span,
}
return node
}
if (token.type === "boolean") {
this.advance()
const node: Literal = {
type: "Literal",
kind: "boolean",
value: token.value,
span: token.span,
}
return node
}
if (token.type === "null") {
this.advance()
const node: Literal = { type: "Literal", kind: "null", value: null, span: token.span }
return node
}
if (token.type === "regex") {
this.advance()
const node: Literal = {
type: "Literal",
kind: "regex",
value: token.pattern,
flags: token.flags,
span: token.span,
}
return node
}
if (token.type === "identifier") {
this.advance()
const node: Identifier = { type: "Identifier", name: token.value, span: token.span }
return node
}
if (token.type === "this") {
this.advance()
const node: Identifier = { type: "Identifier", name: "this", span: token.span }
return node
}
if (token.type === "punctuation" && token.value === "(") {
this.advance()
const expr = this.parseExpression(0)
const closeToken = this.peek()
if (closeToken.type === "punctuation" && closeToken.value === ")") {
this.advance()
} else {
this.error("expected ')'", closeToken.span)
this.syncTo(")")
const maybeClose = this.peek()
if (maybeClose.type === "punctuation" && maybeClose.value === ")") {
this.advance()
}
}
return expr
}
if (token.type === "punctuation" && token.value === "[") {
return this.parseList()
}
this.error("unexpected token", token.span)
this.advance()
const node: ErrorExpr = { type: "ErrorExpr", message: "unexpected token", span: token.span }
return node
}
private parseList(): Expr {
const startToken = this.peek()
this.advance()
const elements: Expr[] = []
while (this.peek().type !== "eof") {
const next = this.peek()
if (next.type === "punctuation" && next.value === "]") {
this.advance()
const span = spanFrom(startToken.span, next.span)
const node: ListExpr = { type: "ListExpr", elements, span }
return node
}
const element = this.parseExpression(0)
elements.push(element)
const sep = this.peek()
if (sep.type === "punctuation" && sep.value === ",") {
this.advance()
const maybeClose = this.peek()
if (maybeClose.type === "punctuation" && maybeClose.value === "]") {
this.advance()
const span = spanFrom(startToken.span, maybeClose.span)
const node: ListExpr = { type: "ListExpr", elements, span }
return node
}
continue
}
if (sep.type === "punctuation" && sep.value === "]") {
this.advance()
const span = spanFrom(startToken.span, sep.span)
const node: ListExpr = { type: "ListExpr", elements, span }
return node
}
this.error("expected ',' or ']'", sep.span)
this.syncTo("]")
const maybeClose = this.peek()
if (maybeClose.type === "punctuation" && maybeClose.value === "]") {
const endToken = maybeClose
this.advance()
const span = spanFrom(startToken.span, endToken.span)
const node: ListExpr = { type: "ListExpr", elements, span }
return node
}
break
}
const endToken = this.previous()
const span = spanFrom(startToken.span, endToken.span)
return { type: "ListExpr", elements, span }
}
private error(message: string, span: Token["span"]) {
this.diagnostics.push({ kind: "parse", message, span })
}
private syncTo(value: ")" | "]") {
while (this.peek().type !== "eof") {
const token = this.peek()
if (token.type === "punctuation" && token.value === value) {
return
}
this.advance()
}
}
private peek(): Token {
return this.tokens[this.index]
}
private previous(): Token {
return this.tokens[Math.max(0, this.index - 1)]
}
private advance(): Token {
const token = this.tokens[this.index]
if (this.index < this.tokens.length - 1) this.index += 1
return token
}
}
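The `lbp`/`rbp` table and the loop in `parseExpression` above implement a standard Pratt parser. As a standalone illustration (a hypothetical mini-grammar over `+` and `*`, not this parser's actual code), higher left-binding power makes an operator bind tighter:

```typescript
// Minimal Pratt loop: "1 + 2 * 3" groups as 1 + (2 * 3) because "*" has
// the higher lbp (11 vs 9), mirroring the binding-power table above.
type Tok = number | "+" | "*"
const bp: Record<string, { lbp: number; rbp: number }> = {
  "+": { lbp: 9, rbp: 10 },
  "*": { lbp: 11, rbp: 12 },
}

function evalPratt(toks: Tok[], pos = { i: 0 }, minBp = 0): number {
  let left = toks[pos.i++] as number
  while (pos.i < toks.length) {
    const op = toks[pos.i] as "+" | "*"
    if (bp[op].lbp < minBp) break // operator binds looser than caller: stop
    pos.i++
    const right = evalPratt(toks, pos, bp[op].rbp) // recurse with op's rbp
    left = op === "+" ? left + right : left * right
  }
  return left
}

console.log(evalPratt([1, "+", 2, "*", 3])) // → 7
```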


@@ -1,27 +0,0 @@
import assert from "node:assert"
import test from "node:test"
import { parseExpressionSource } from "./parser"
import { buildPropertyExpressionSource } from "./properties"
test("builds property expression sources", () => {
const cases: Array<{ input: string; expected: string }> = [
{ input: "status", expected: "note.status" },
{ input: "note.status", expected: "note.status" },
{ input: "file.name", expected: "file.name" },
{ input: "file.my-field", expected: 'file["my-field"]' },
{ input: "my-field", expected: 'note["my-field"]' },
{ input: 'note["my field"]', expected: 'note["my field"]' },
{ input: "formula.total", expected: "formula.total" },
{ input: "this.file.name", expected: "this.file.name" },
{ input: "a.b-c.d", expected: 'note.a["b-c"].d' },
{ input: "date(file.ctime)", expected: "date(file.ctime)" },
]
for (const entry of cases) {
const result = buildPropertyExpressionSource(entry.input)
assert.strictEqual(result, entry.expected)
const parsed = parseExpressionSource(entry.expected)
assert.strictEqual(parsed.diagnostics.length, 0)
assert.ok(parsed.program.body)
}
})


@@ -1,27 +0,0 @@
const simpleIdentifierPattern = /^[A-Za-z_][A-Za-z0-9_]*$/
export function buildPropertyExpressionSource(property: string): string | null {
const trimmed = property.trim()
if (!trimmed) return null
if (trimmed.includes("(") || trimmed.includes("[") || trimmed.includes("]")) {
return trimmed
}
const parts = trimmed.split(".")
const root = parts[0]
const rest = parts.slice(1)
const buildAccess = (base: string, segments: string[]) => {
let source = base
for (const segment of segments) {
if (simpleIdentifierPattern.test(segment)) {
source = `${source}.${segment}`
} else {
source = `${source}[${JSON.stringify(segment)}]`
}
}
return source
}
if (root === "file" || root === "note" || root === "formula" || root === "this") {
return buildAccess(root, rest)
}
return buildAccess("note", parts)
}


@@ -1,36 +0,0 @@
export const BUILTIN_SUMMARY_TYPES = [
"count",
"sum",
"average",
"avg",
"min",
"max",
"range",
"unique",
"filled",
"missing",
"median",
"stddev",
"checked",
"unchecked",
"empty",
"earliest",
"latest",
] as const
export type BuiltinSummaryType = (typeof BUILTIN_SUMMARY_TYPES)[number]
export interface SummaryDefinition {
type: "builtin" | "formula"
builtinType?: BuiltinSummaryType
formulaRef?: string
expression?: string
}
export interface ViewSummaryConfig {
columns: Record<string, SummaryDefinition>
}
export interface PropertyConfig {
displayName?: string
}


@@ -1,42 +0,0 @@
import { Span } from "./ast"
export type Operator =
| "=="
| "!="
| ">="
| "<="
| ">"
| "<"
| "&&"
| "||"
| "+"
| "-"
| "*"
| "/"
| "%"
| "!"
export type Punctuation = "." | "," | "(" | ")" | "[" | "]"
export type NumberToken = { type: "number"; value: number; span: Span }
export type StringToken = { type: "string"; value: string; span: Span }
export type BooleanToken = { type: "boolean"; value: boolean; span: Span }
export type NullToken = { type: "null"; span: Span }
export type IdentifierToken = { type: "identifier"; value: string; span: Span }
export type ThisToken = { type: "this"; span: Span }
export type OperatorToken = { type: "operator"; value: Operator; span: Span }
export type PunctuationToken = { type: "punctuation"; value: Punctuation; span: Span }
export type RegexToken = { type: "regex"; pattern: string; flags: string; span: Span }
export type EofToken = { type: "eof"; span: Span }
export type Token =
| NumberToken
| StringToken
| BooleanToken
| NullToken
| IdentifierToken
| ThisToken
| OperatorToken
| PunctuationToken
| RegexToken
| EofToken


@@ -1,278 +0,0 @@
import yaml from "js-yaml"
import fs from "node:fs/promises"
import path from "node:path"
import {
parseExpressionSource,
compileExpression,
buildPropertyExpressionSource,
BUILTIN_SUMMARY_TYPES,
} from "./compiler"
import { Expr, LogicalExpr, UnaryExpr, spanFrom } from "./compiler/ast"
import { Diagnostic } from "./compiler/errors"
const isRecord = (value: unknown): value is Record<string, unknown> =>
typeof value === "object" && value !== null && !Array.isArray(value)
type CollectedExpression = {
kind: string
context: string
source: string
ast: Expr | null
ir: unknown
diagnostics: Diagnostic[]
}
const parseToExpr = (source: string, filePath: string) => {
const result = parseExpressionSource(source, filePath)
return { expr: result.program.body ?? null, diagnostics: result.diagnostics }
}
const buildLogical = (operator: "&&" | "||", expressionsList: Expr[]): Expr | null => {
if (expressionsList.length === 0) return null
let current: Expr | null = null
for (const next of expressionsList) {
if (!current) {
current = next
continue
}
const span = spanFrom(current.span, next.span)
const node: LogicalExpr = { type: "LogicalExpr", operator, left: current, right: next, span }
current = node
}
return current
}
const negateExpressions = (expressionsList: Expr[]): Expr[] =>
expressionsList.map((expr) => {
const node: UnaryExpr = {
type: "UnaryExpr",
operator: "!",
argument: expr,
span: spanFrom(expr.span, expr.span),
}
return node
})
const buildFilterExpr = (
raw: unknown,
context: string,
diagnostics: Diagnostic[],
filePath: string,
): Expr | null => {
if (typeof raw === "string") {
const parsed = parseToExpr(raw, filePath)
diagnostics.push(...parsed.diagnostics)
return parsed.expr
}
if (!isRecord(raw)) return null
if (Array.isArray(raw.and)) {
const parts = raw.and
.map((entry, index) =>
buildFilterExpr(entry, `${context}.and[${index}]`, diagnostics, filePath),
)
.filter((entry): entry is Expr => Boolean(entry))
return buildLogical("&&", parts)
}
if (Array.isArray(raw.or)) {
const parts = raw.or
.map((entry, index) =>
buildFilterExpr(entry, `${context}.or[${index}]`, diagnostics, filePath),
)
.filter((entry): entry is Expr => Boolean(entry))
return buildLogical("||", parts)
}
if (Array.isArray(raw.not)) {
const parts = raw.not
.map((entry, index) =>
buildFilterExpr(entry, `${context}.not[${index}]`, diagnostics, filePath),
)
.filter((entry): entry is Expr => Boolean(entry))
return buildLogical("&&", negateExpressions(parts))
}
return null
}
const collectPropertyExpressions = (
views: unknown[],
): Map<string, { source: string; context: string }> => {
const entries = new Map<string, { source: string; context: string }>()
const addProperty = (property: string, context: string) => {
const key = property.trim()
if (!key || entries.has(key)) return
const source = buildPropertyExpressionSource(key)
if (!source) return
entries.set(key, { source, context })
}
views.forEach((view, viewIndex) => {
if (!isRecord(view)) return
const viewContext = `views[${viewIndex}]`
if (Array.isArray(view.order)) {
view.order.forEach((entry, orderIndex) => {
if (typeof entry === "string") {
addProperty(entry, `${viewContext}.order[${orderIndex}]`)
}
})
}
if (Array.isArray(view.sort)) {
view.sort.forEach((entry, sortIndex) => {
if (isRecord(entry) && typeof entry.property === "string") {
addProperty(entry.property, `${viewContext}.sort[${sortIndex}].property`)
}
})
}
if (typeof view.groupBy === "string") {
addProperty(view.groupBy, `${viewContext}.groupBy`)
} else if (isRecord(view.groupBy) && typeof view.groupBy.property === "string") {
addProperty(view.groupBy.property, `${viewContext}.groupBy.property`)
}
if (view.summaries && isRecord(view.summaries)) {
const columns =
"columns" in view.summaries && isRecord(view.summaries.columns)
? view.summaries.columns
: view.summaries
for (const key of Object.keys(columns)) {
addProperty(key, `${viewContext}.summaries.${key}`)
}
}
if (typeof view.image === "string") {
addProperty(view.image, `${viewContext}.image`)
}
if (view.type === "map") {
const coords = typeof view.coordinates === "string" ? view.coordinates : "coordinates"
addProperty(coords, `${viewContext}.coordinates`)
if (typeof view.markerIcon === "string") {
addProperty(view.markerIcon, `${viewContext}.markerIcon`)
}
if (typeof view.markerColor === "string") {
addProperty(view.markerColor, `${viewContext}.markerColor`)
}
}
})
return entries
}
const main = async () => {
const inputPath = process.argv[2] ? String(process.argv[2]) : "content/antilibrary.base"
const filePath = path.resolve(process.cwd(), inputPath)
const raw = await fs.readFile(filePath, "utf8")
const parsed = yaml.load(raw)
const config = isRecord(parsed) ? parsed : {}
const collected: CollectedExpression[] = []
if (config.filters !== undefined) {
const diagnostics: Diagnostic[] = []
const expr = buildFilterExpr(config.filters, "filters", diagnostics, filePath)
collected.push({
kind: "filters",
context: "filters",
source: typeof config.filters === "string" ? config.filters : JSON.stringify(config.filters),
ast: expr,
ir: expr ? compileExpression(expr) : null,
diagnostics,
})
}
if (isRecord(config.formulas)) {
for (const [name, value] of Object.entries(config.formulas)) {
if (typeof value !== "string") continue
const parsedExpr = parseToExpr(value, filePath)
collected.push({
kind: "formula",
context: `formulas.${name}`,
source: value,
ast: parsedExpr.expr,
ir: parsedExpr.expr ? compileExpression(parsedExpr.expr) : null,
diagnostics: parsedExpr.diagnostics,
})
}
}
const topLevelSummaries = isRecord(config.summaries) ? config.summaries : {}
if (isRecord(config.summaries)) {
for (const [name, value] of Object.entries(config.summaries)) {
if (typeof value !== "string") continue
const parsedExpr = parseToExpr(value, filePath)
collected.push({
kind: "summary",
context: `summaries.${name}`,
source: value,
ast: parsedExpr.expr,
ir: parsedExpr.expr ? compileExpression(parsedExpr.expr) : null,
diagnostics: parsedExpr.diagnostics,
})
}
}
if (Array.isArray(config.views)) {
config.views.forEach((view, index) => {
if (!isRecord(view)) return
if (view.filters !== undefined) {
const diagnostics: Diagnostic[] = []
const expr = buildFilterExpr(view.filters, `views[${index}].filters`, diagnostics, filePath)
collected.push({
kind: "view.filter",
context: `views[${index}].filters`,
source: typeof view.filters === "string" ? view.filters : JSON.stringify(view.filters),
ast: expr,
ir: expr ? compileExpression(expr) : null,
diagnostics,
})
}
if (view.summaries && isRecord(view.summaries)) {
const columns =
"columns" in view.summaries && isRecord(view.summaries.columns)
? view.summaries.columns
: view.summaries
for (const [column, summaryValue] of Object.entries(columns)) {
if (typeof summaryValue !== "string") continue
const normalized = summaryValue.toLowerCase().trim()
const builtins = new Set<string>(BUILTIN_SUMMARY_TYPES)
if (builtins.has(normalized)) continue
const summarySource =
summaryValue in topLevelSummaries && typeof topLevelSummaries[summaryValue] === "string"
? String(topLevelSummaries[summaryValue])
: summaryValue
const parsedExpr = parseToExpr(summarySource, filePath)
collected.push({
kind: "view.summary",
context: `views[${index}].summaries.${column}`,
source: summarySource,
ast: parsedExpr.expr,
ir: parsedExpr.expr ? compileExpression(parsedExpr.expr) : null,
diagnostics: parsedExpr.diagnostics,
})
}
}
})
}
const views = Array.isArray(config.views) ? config.views : []
const propertyExpressions = collectPropertyExpressions(views)
for (const [_, entry] of propertyExpressions.entries()) {
const parsedExpr = parseToExpr(entry.source, filePath)
collected.push({
kind: "property",
context: entry.context,
source: entry.source,
ast: parsedExpr.expr,
ir: parsedExpr.expr ? compileExpression(parsedExpr.expr) : null,
diagnostics: parsedExpr.diagnostics,
})
}
const payload = { file: inputPath, count: collected.length, expressions: collected }
process.stdout.write(JSON.stringify(payload, null, 2))
}
main()


@@ -1,248 +0,0 @@
import { QuartzPluginData } from "../../plugins/vfile"
import { evaluateSummaryExpression, valueToUnknown, EvalContext, ProgramIR } from "./compiler"
import { SummaryDefinition, ViewSummaryConfig, BuiltinSummaryType } from "./types"
type SummaryValueResolver = (
file: QuartzPluginData,
column: string,
allFiles: QuartzPluginData[],
) => unknown
type SummaryContextFactory = (file: QuartzPluginData) => EvalContext
export function computeColumnSummary(
column: string,
files: QuartzPluginData[],
summary: SummaryDefinition,
allFiles: QuartzPluginData[] = [],
valueResolver: SummaryValueResolver,
getContext: SummaryContextFactory,
summaryExpression?: ProgramIR,
): string | number | undefined {
if (files.length === 0) {
return undefined
}
const values = files.map((file) => valueResolver(file, column, allFiles))
if (summary.type === "builtin" && summary.builtinType) {
return computeBuiltinSummary(values, summary.builtinType)
}
if (summary.type === "formula" && summary.expression) {
if (summaryExpression) {
const summaryCtx = getContext(files[0])
summaryCtx.diagnosticContext = `summaries.${column}`
summaryCtx.diagnosticSource = summary.expression
summaryCtx.rows = files
const value = evaluateSummaryExpression(summaryExpression, values, summaryCtx)
const unknownValue = valueToUnknown(value)
if (typeof unknownValue === "number" || typeof unknownValue === "string") {
return unknownValue
}
return undefined
}
}
return undefined
}
function computeBuiltinSummary(
values: any[],
type: BuiltinSummaryType,
): string | number | undefined {
switch (type) {
case "count":
return values.length
case "sum": {
const nums = values.filter((v) => typeof v === "number")
if (nums.length === 0) return undefined
return nums.reduce((acc, v) => acc + v, 0)
}
case "average":
case "avg": {
const nums = values.filter((v) => typeof v === "number")
if (nums.length === 0) return undefined
const sum = nums.reduce((acc, v) => acc + v, 0)
return Math.round((sum / nums.length) * 100) / 100
}
case "min": {
const comparable = values.filter(
(v) => typeof v === "number" || v instanceof Date || typeof v === "string",
)
if (comparable.length === 0) return undefined
const normalized = comparable.map((v) => (v instanceof Date ? v.getTime() : v))
const min = Math.min(...normalized.filter((v) => typeof v === "number"))
if (isNaN(min)) {
const strings = comparable.filter((v) => typeof v === "string") as string[]
if (strings.length === 0) return undefined
return strings.sort()[0]
}
if (comparable.some((v) => v instanceof Date)) {
return new Date(min).toISOString().split("T")[0]
}
return min
}
case "max": {
const comparable = values.filter(
(v) => typeof v === "number" || v instanceof Date || typeof v === "string",
)
if (comparable.length === 0) return undefined
const normalized = comparable.map((v) => (v instanceof Date ? v.getTime() : v))
const max = Math.max(...normalized.filter((v) => typeof v === "number"))
if (isNaN(max)) {
const strings = comparable.filter((v) => typeof v === "string") as string[]
if (strings.length === 0) return undefined
return strings.sort().reverse()[0]
}
if (comparable.some((v) => v instanceof Date)) {
return new Date(max).toISOString().split("T")[0]
}
return max
}
case "range": {
const comparable = values.filter(
(v) => typeof v === "number" || v instanceof Date || typeof v === "string",
)
if (comparable.length === 0) return undefined
const normalized = comparable.map((v) => (v instanceof Date ? v.getTime() : v))
const nums = normalized.filter((v) => typeof v === "number")
if (nums.length === 0) return undefined
const min = Math.min(...nums)
const max = Math.max(...nums)
if (comparable.some((v) => v instanceof Date)) {
return `${new Date(min).toISOString().split("T")[0]} - ${new Date(max).toISOString().split("T")[0]}`
}
return `${min} - ${max}`
}
case "unique": {
const nonNull = values.filter((v) => v !== undefined && v !== null && v !== "")
const unique = new Set(nonNull.map((v) => (v instanceof Date ? v.toISOString() : String(v))))
return unique.size
}
case "filled": {
const filled = values.filter((v) => v !== undefined && v !== null && v !== "")
return filled.length
}
case "missing": {
const missing = values.filter((v) => v === undefined || v === null || v === "")
return missing.length
}
case "median": {
const nums = values.filter((v) => typeof v === "number") as number[]
if (nums.length === 0) return undefined
const sorted = [...nums].sort((a, b) => a - b)
const mid = Math.floor(sorted.length / 2)
if (sorted.length % 2 === 0) {
return (sorted[mid - 1] + sorted[mid]) / 2
}
return sorted[mid]
}
case "stddev": {
const nums = values.filter((v) => typeof v === "number") as number[]
if (nums.length === 0) return undefined
const mean = nums.reduce((acc, v) => acc + v, 0) / nums.length
const variance = nums.reduce((acc, v) => acc + (v - mean) * (v - mean), 0) / nums.length
return Math.round(Math.sqrt(variance) * 100) / 100
}
case "checked":
return values.filter((v) => v === true).length
case "unchecked":
return values.filter((v) => v === false).length
case "empty": {
const count = values.filter(
(v) =>
v === undefined ||
v === null ||
v === "" ||
(Array.isArray(v) && v.length === 0) ||
(typeof v === "object" && v !== null && !Array.isArray(v) && Object.keys(v).length === 0),
).length
return count
}
case "earliest": {
const dates = values.filter(
(v) =>
v instanceof Date ||
(typeof v === "string" && /^\d{4}-\d{2}-\d{2}/.test(v)) ||
typeof v === "number",
)
if (dates.length === 0) return undefined
const timestamps = dates.map((v) => {
if (v instanceof Date) return v.getTime()
if (typeof v === "string") return new Date(v).getTime()
return v
})
const earliest = Math.min(...timestamps)
return new Date(earliest).toISOString().split("T")[0]
}
case "latest": {
const dates = values.filter(
(v) =>
v instanceof Date ||
(typeof v === "string" && /^\d{4}-\d{2}-\d{2}/.test(v)) ||
typeof v === "number",
)
if (dates.length === 0) return undefined
const timestamps = dates.map((v) => {
if (v instanceof Date) return v.getTime()
if (typeof v === "string") return new Date(v).getTime()
return v
})
const latest = Math.max(...timestamps)
return new Date(latest).toISOString().split("T")[0]
}
default:
return undefined
}
}
export function computeViewSummaries(
columns: string[],
files: QuartzPluginData[],
summaryConfig: ViewSummaryConfig | undefined,
allFiles: QuartzPluginData[] = [],
getContext: SummaryContextFactory,
valueResolver: SummaryValueResolver,
summaryExpressions?: Record<string, ProgramIR>,
): Record<string, string | number | undefined> {
const results: Record<string, string | number | undefined> = {}
if (!summaryConfig?.columns) {
return results
}
for (const column of columns) {
const summary = summaryConfig.columns[column]
if (summary) {
const expression = summaryExpressions ? summaryExpressions[column] : undefined
results[column] = computeColumnSummary(
column,
files,
summary,
allFiles,
valueResolver,
getContext,
expression,
)
}
}
return results
}
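The builtin aggregations in `computeBuiltinSummary` above follow standard formulas. A minimal standalone sketch of the `median` and `stddev` cases (hypothetical helper names, not the module's exports; rounding matches the code above):

```typescript
// Median: middle value of the sorted list (mean of the two middles for even length).
function median(nums: number[]): number | undefined {
  if (nums.length === 0) return undefined
  const sorted = [...nums].sort((a, b) => a - b)
  const mid = Math.floor(sorted.length / 2)
  return sorted.length % 2 === 0 ? (sorted[mid - 1] + sorted[mid]) / 2 : sorted[mid]
}

// Population standard deviation, rounded to 2 decimals like the "stddev" case above.
function stddev(nums: number[]): number | undefined {
  if (nums.length === 0) return undefined
  const mean = nums.reduce((acc, v) => acc + v, 0) / nums.length
  const variance = nums.reduce((acc, v) => acc + (v - mean) ** 2, 0) / nums.length
  return Math.round(Math.sqrt(variance) * 100) / 100
}

console.log(median([3, 1, 2])) // → 2
console.log(stddev([2, 4, 4, 4, 5, 5, 7, 9])) // → 2
```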

File diff suppressed because it is too large


@@ -1,31 +0,0 @@
import assert from "node:assert"
import test from "node:test"
import { parseViews, parseViewSummaries } from "./types"
test("parseViews preserves raw filters", () => {
const views = parseViews([
{ type: "table", name: "test", filters: 'status == "done"', order: ["file.name"] },
])
assert.strictEqual(views.length, 1)
assert.strictEqual(views[0].filters, 'status == "done"')
assert.deepStrictEqual(views[0].order, ["file.name"])
})
test("parseViews rejects missing type/name", () => {
assert.throws(() => parseViews([{}]))
})
test("parseViewSummaries resolves builtin and formula refs", () => {
const summaries = parseViewSummaries(
{ price: "Average", score: "avgScore", extra: "values.length" },
{ avgScore: "values.mean()" },
)
assert.ok(summaries)
if (!summaries) return
assert.strictEqual(summaries.columns.price.type, "builtin")
assert.strictEqual(summaries.columns.score.type, "formula")
assert.strictEqual(summaries.columns.score.formulaRef, "avgScore")
assert.strictEqual(summaries.columns.extra.type, "formula")
})


@@ -1,119 +0,0 @@
import {
SummaryDefinition,
ViewSummaryConfig,
PropertyConfig,
BuiltinSummaryType,
BUILTIN_SUMMARY_TYPES,
} from "./compiler/schema"
export type { SummaryDefinition, ViewSummaryConfig, PropertyConfig, BuiltinSummaryType }
export { BUILTIN_SUMMARY_TYPES }
const isRecord = (value: unknown): value is Record<string, unknown> =>
typeof value === "object" && value !== null && !Array.isArray(value)
const isNonEmptyString = (value: unknown): value is string =>
typeof value === "string" && value.trim().length > 0
export type BaseFileFilter =
| string
| { and: BaseFileFilter[] }
| { or: BaseFileFilter[] }
| { not: BaseFileFilter[] }
export interface BaseFile {
filters?: BaseFileFilter
views: BaseView[]
properties?: Record<string, PropertyConfig>
summaries?: Record<string, string>
formulas?: Record<string, string>
}
export interface BaseView {
type: "table" | "list" | "gallery" | "board" | "calendar" | "card" | "cards" | "map"
name: string
order?: string[]
sort?: BaseSortConfig[]
columnSize?: Record<string, number>
groupBy?: string | BaseGroupBy
limit?: number
filters?: BaseFileFilter
summaries?: Record<string, string> | ViewSummaryConfig
image?: string
cardSize?: number
cardAspect?: number
nestedProperties?: boolean
indentProperties?: boolean
separator?: string
date?: string
dateField?: string
dateProperty?: string
coordinates?: string
markerIcon?: string
markerColor?: string
defaultZoom?: number
defaultCenter?: [number, number]
clustering?: boolean
groupSizes?: Record<string, number>
groupAspects?: Record<string, number>
}
export interface BaseSortConfig {
property: string
direction: "ASC" | "DESC"
}
export interface BaseGroupBy {
property: string
direction: "ASC" | "DESC"
}
export function parseViews(raw: unknown[]): BaseView[] {
return raw.map((entry) => {
if (!isRecord(entry)) throw new Error("Each view must be an object")
const { type, name } = entry
if (!isNonEmptyString(type) || !isNonEmptyString(name)) {
throw new Error("Each view must have 'type' and 'name' fields")
}
return { ...entry, type, name } as BaseView
})
}
export function parseViewSummaries(
viewSummaries: Record<string, string> | ViewSummaryConfig | undefined,
topLevelSummaries?: Record<string, string>,
): ViewSummaryConfig | undefined {
if (!viewSummaries || typeof viewSummaries !== "object") return undefined
if ("columns" in viewSummaries && typeof viewSummaries.columns === "object") {
return viewSummaries as ViewSummaryConfig
}
const columns: Record<string, SummaryDefinition> = {}
for (const [column, summaryValue] of Object.entries(viewSummaries)) {
if (typeof summaryValue !== "string") continue
const normalized = summaryValue.toLowerCase().trim()
if (BUILTIN_SUMMARY_TYPES.includes(normalized as BuiltinSummaryType)) {
columns[column] = { type: "builtin", builtinType: normalized as BuiltinSummaryType }
continue
}
if (topLevelSummaries && summaryValue in topLevelSummaries) {
columns[column] = {
type: "formula",
formulaRef: summaryValue,
expression: topLevelSummaries[summaryValue],
}
continue
}
if (summaryValue.includes("(") || summaryValue.includes(".")) {
columns[column] = { type: "formula", expression: summaryValue }
}
}
return Object.keys(columns).length > 0 ? { columns } : undefined
}


@@ -73,7 +73,7 @@ export function slugifyFilePath(fp: FilePath, excludeExt?: boolean): FullSlug {
fp = stripSlashes(fp) as FilePath
let ext = getFileExtension(fp)
const withoutFileExt = fp.replace(new RegExp(ext + "$"), "")
if (excludeExt || [".md", ".html", ".base", undefined].includes(ext)) {
if (excludeExt || [".md", ".html", undefined].includes(ext)) {
ext = ""
}


@@ -1,94 +0,0 @@
import { FilePath, FullSlug, slugifyFilePath } from "./path"
export type WikilinkWithPosition = {
wikilink: ParsedWikilink
start: number
end: number
}
export type ParsedWikilink = {
raw: string
target: string
anchor?: string
alias?: string
embed: boolean
}
export type ResolvedWikilink = {
slug: FullSlug
anchor?: string
}
const wikilinkRegex = /^!?\[\[([^\]|#]+)(?:#([^\]|]+))?(?:\|([^\]]+))?\]\]$/
export function parseWikilink(text: string): ParsedWikilink | null {
const trimmed = text.trim()
const match = wikilinkRegex.exec(trimmed)
if (!match) return null
const [, target, anchor, alias] = match
return {
raw: trimmed,
target: target?.trim() ?? "",
anchor: anchor?.trim(),
alias: alias?.trim(),
embed: trimmed.startsWith("!"),
}
}
export function resolveWikilinkTarget(
parsed: ParsedWikilink,
currentSlug: FullSlug,
): ResolvedWikilink | null {
const target = parsed.target.trim()
if (!target) return null
if (target.startsWith("/")) {
const slug = slugifyFilePath(target.slice(1).replace(/\\/g, "/") as FilePath)
return { slug, anchor: parsed.anchor }
}
const currentParts = currentSlug.split("/")
const currentDir = currentParts.slice(0, -1)
const targetParts = target.replace(/\\/g, "/").split("/")
const resolved: string[] = [...currentDir]
for (const part of targetParts) {
if (part === "..") {
resolved.pop()
} else if (part !== "." && part.length > 0) {
resolved.push(part)
}
}
const slug = slugifyFilePath(resolved.join("/") as FilePath)
return { slug, anchor: parsed.anchor }
}
const globalWikilinkRegex = /!?\[\[([^\]|#]+)(?:#([^\]|]+))?(?:\|([^\]]+))?\]\]/g
export function extractWikilinksWithPositions(text: string): WikilinkWithPosition[] {
const results: WikilinkWithPosition[] = []
let match: RegExpExecArray | null
globalWikilinkRegex.lastIndex = 0
while ((match = globalWikilinkRegex.exec(text)) !== null) {
const [fullMatch, target, anchor, alias] = match
results.push({
wikilink: {
raw: fullMatch,
target: target?.trim() ?? "",
anchor: anchor?.trim(),
alias: alias?.trim(),
embed: fullMatch.startsWith("!"),
},
start: match.index,
end: match.index + fullMatch.length,
})
}
return results
}
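The interesting part of `resolveWikilinkTarget` above is the relative-path walk: `..` pops a directory, `.` and empty segments are dropped, everything else is pushed. A hypothetical standalone sketch of that same algorithm (illustration only, not the removed module):

```typescript
// Resolve a relative wikilink target against the directory of the current slug.
function resolveRelative(currentSlug: string, target: string): string {
  const resolved = currentSlug.split("/").slice(0, -1) // directory of current note
  for (const part of target.replace(/\\/g, "/").split("/")) {
    if (part === "..") resolved.pop()
    else if (part !== "." && part.length > 0) resolved.push(part)
  }
  return resolved.join("/")
}

console.log(resolveRelative("notes/madrid/metro", "../bus")) // → "notes/bus"
```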

scripts/export.exs Normal file

@@ -0,0 +1,136 @@
#!/usr/bin/env elixir
# Export org-roam notes (per-file) to content/ via ox-hugo.
#
# Usage:
# NOTES_DIR=~/notes elixir scripts/export.exs
# elixir scripts/export.exs /path/to/notes
#
# The positional argument takes precedence over the NOTES_DIR env var.
notes_dir =
case System.argv() do
[dir | _] -> dir
[] ->
System.get_env("NOTES_DIR") ||
(IO.puts(:stderr, "Usage: NOTES_DIR=/path/to/notes elixir scripts/export.exs"); System.halt(1))
end
notes_dir = Path.expand(notes_dir)
repo_root = __DIR__ |> Path.join("..") |> Path.expand()
content_dir = Path.join(repo_root, "content")
unless File.dir?(notes_dir) do
IO.puts(:stderr, "Error: notes directory does not exist: #{notes_dir}")
System.halt(1)
end
# Wipe content/, preserving .gitkeep
IO.puts("==> Wiping #{content_dir}")
content_dir
|> File.ls!()
|> Enum.reject(&(&1 == ".gitkeep"))
|> Enum.each(fn entry ->
Path.join(content_dir, entry) |> File.rm_rf!()
end)
# Collect all .org files
IO.puts("==> Exporting org files from #{notes_dir}")
org_files =
Path.join(notes_dir, "**/*.org")
|> Path.wildcard()
if org_files == [] do
IO.puts("No .org files found in #{notes_dir}")
System.halt(0)
end
# Export each file via emacs --batch
results =
Enum.map(org_files, fn orgfile ->
IO.puts(" exporting: #{orgfile}")
# Mirror the notes subdirectory structure under content/
section =
orgfile
|> Path.dirname()
|> Path.relative_to(notes_dir)
{output, exit_code} =
System.cmd(
"emacs",
[
"--batch",
"--eval", "(require 'ox-hugo)",
"--eval", ~s[(setq org-hugo-base-dir "#{repo_root}")],
"--eval", ~s[(setq org-hugo-default-section-directory "#{section}")],
"--visit", orgfile,
"--funcall", "org-hugo-export-to-md"
],
stderr_to_stdout: true
)
# Filter noisy emacs startup lines, same as the shell script
filtered =
output
|> String.split("\n")
|> Enum.reject(&String.match?(&1, ~r/^Loading|^ad-handle|^For information/))
|> Enum.join("\n")
if filtered != "", do: IO.puts(filtered)
{orgfile, exit_code}
end)
failures = Enum.filter(results, fn {_, code} -> code != 0 end)
if failures != [] do
IO.puts(:stderr, "\nFailed to export #{length(failures)} file(s):")
Enum.each(failures, fn {f, code} -> IO.puts(:stderr, " [exit #{code}] #{f}") end)
System.halt(1)
end
# Generate a default index.md if none was exported
index_path = Path.join(content_dir, "index.md")

unless File.exists?(index_path) do
  IO.puts("==> Generating default index.md")

  pages =
    Path.join(content_dir, "**/*.md")
    |> Path.wildcard()
    |> Enum.map(fn path ->
      slug = Path.relative_to(path, content_dir) |> Path.rootname()

      title =
        path
        |> File.read!()
        |> then(fn content ->
          # Exported notes carry TOML front matter (the ox-hugo default)
          case Regex.run(~r/^title\s*=\s*"(.+)"/m, content) do
            [_, t] -> t
            _ -> slug
          end
        end)

      {slug, title}
    end)
    |> Enum.sort_by(fn {_, title} -> title end)
    |> Enum.map(fn {slug, title} -> "- [#{title}](#{slug})" end)
    |> Enum.join("\n")

  # Use TOML front matter so the fallback index matches the ox-hugo output
  # format the rest of the site is configured for
  File.write!(index_path, """
  +++
  title = "Index"
  +++

  #{pages}
  """)
end

# Count after the fallback index may have been written, so the total is accurate
md_count =
  Path.join(content_dir, "**/*.md")
  |> Path.wildcard()
  |> length()

IO.puts("==> Done. #{md_count} markdown files in #{content_dir}")