River Bottega — 水

A personal atelier. Research, writing, music — held in one hand, working across forms.

Solo, multidisciplinary, slow.

Recent

DenseNet121 chest X-ray classifier compressed 51.43% (6.97M → 3.38M parameters) at constant accuracy (AUROC +0.0003). Compression makes channel attribution atomic enough for surgical correction: zeroing the weights of 5 channels reduces a target false positive's probability by 0.13 (Δprob = −0.13) with zero true-positive loss and exactly zero AUROC change on the other 13 pathologies. Polarized channels are reframed as bipolar discriminative axes that exploit label mutual exclusivity (Jaccard < 0.1 in 89 of 100 polarized channels, zero architectural conflicts).
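The correction step is mechanically simple. A minimal NumPy sketch, assuming a conv weight of shape (out_ch, in_ch, kH, kW); the channel indices below are illustrative, not the attributed channels from the work:

```python
import numpy as np

def zero_channels(weight, channel_ids):
    """Copy a conv weight (out_ch, in_ch, kH, kW) and zero the listed
    output channels, silencing their downstream contribution."""
    w = weight.copy()
    w[list(channel_ids)] = 0.0
    return w

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 32, 3, 3))
w_fixed = zero_channels(w, [3, 7, 19, 40, 55])  # 5-channel zeroing
```

Because only whole channels are touched, every other channel's weights are bit-identical before and after the edit.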

Layer-level analysis of BERT across five GLUE tasks, applying a forward-primary learning framework. Three findings: (1) separability-guided layer skip with a compensation classifier — lossless compression on 3 of 5 GLUE tasks; (2) the FFN's role decomposes as 92% structural (norm normalization) and 8% classification, explaining why FFN removal hurts even when individual layers look classification-harmful; (3) 60–93% of misclassifications are high-confidence errors — the BERT CLS vector itself is the fundamental limitation.
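A toy sketch of the separability-guided skip in finding (1). The scoring rule here (between-class centroid distance over within-class spread) is an assumption standing in for the actual criterion, and `layer_outputs` stands in for per-layer CLS vectors:

```python
import numpy as np

def separability(X, y):
    """Assumed score: centroid distance / mean within-class spread."""
    c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    spread = X[y == 0].std() + X[y == 1].std()
    return np.linalg.norm(c0 - c1) / (spread + 1e-9)

def pick_skip_layer(layer_outputs, y, threshold=1.0):
    """Earliest layer whose representation is separable enough; deeper
    layers are skipped and a small compensation classifier is trained
    on this layer's output instead."""
    for i, X in enumerate(layer_outputs):
        if separability(X, y) >= threshold:
            return i
    return len(layer_outputs) - 1

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 50)
X_shallow = rng.standard_normal((100, 8))   # not yet separable
X_deep = X_shallow + 10.0 * y[:, None]      # clearly separable
```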

School zones, speed-limit segments, and no-parking intervals are typically stored as coordinate polygons — 80–160 floating-point values per zone, requiring a spatial join at every query. Under BSI, each zone reduces to 8–12 integers and every query becomes an integer comparison. The schema covers the entire Korean address system with 0-cell / 1-cell / 2-cell topological attributes.
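A sketch of the interval idea; the field names (street_id, start, end) are assumptions, not BSI's actual schema. The point is that a zone is a handful of integers and membership is an integer comparison, not a polygon spatial join:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Zone:
    street_id: int
    start: int  # integer linear address along the street (assumed encoding)
    end: int

def in_zone(zone: Zone, street_id: int, pos: int) -> bool:
    # Point-in-zone as pure integer comparison.
    return zone.street_id == street_id and zone.start <= pos <= zone.end

school = Zone(street_id=1042, start=120, end=480)
```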

Standard routing graphs split intersection nodes for every constraint — turn restriction, time-of-day, vehicle class — so node count grows multiplicatively. The Dual Graph encodes constraints as edge attributes; node count stays fixed. On Seoul's network: intra-district routing in 0.9 ms, cross-district hierarchical routing in 3.84 ms — 5.3× faster than flat Dijkstra.
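A minimal sketch of constraints as edge attributes (field names are assumptions): each edge carries its own restrictions, and the search's expansion step calls a transition check, so the node set never grows:

```python
from dataclasses import dataclass, field

@dataclass
class Edge:
    src: int
    dst: int
    cost: float
    truck_ok: bool = True
    banned_next: set = field(default_factory=set)  # dst nodes this edge may not turn into

def can_transition(prev, nxt: Edge, vehicle: str) -> bool:
    if vehicle == "truck" and not nxt.truck_ok:
        return False  # vehicle-class constraint, read off the edge
    if prev is not None and nxt.dst in prev.banned_next:
        return False  # turn restriction, stored on the incoming edge
    return True

e_in = Edge(src=0, dst=1, cost=1.0, banned_next={2})  # no turn toward node 2
e_left = Edge(src=1, dst=2, cost=1.0)
e_right = Edge(src=1, dst=3, cost=1.0, truck_ok=False)
```

Dijkstra over edge states stays otherwise unchanged; `can_transition` simply filters successors during expansion.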

All works →