TriviaQA (Lance Format)

A Lance-formatted version of TriviaQA (rc.nocontext config) — a large reading-comprehension dataset of trivia questions paired with a canonical answer, accepted aliases, and entity-type metadata — with MiniLM question embeddings stored inline and ready for retrieval at hf://datasets/lance-format/trivia-qa-lance/data. The rc.nocontext slice is the standard reading-comprehension form without the multi-gigabyte entity_pages / search_results payloads, which keeps the dataset compact while preserving everything needed for closed-book QA, retrieval research, and use as a search target.

Key features

  • 138k+ trivia questions, each with a canonical answer_value, a normalized form for exact-match scoring, and a list of accepted answer_aliases.
  • Pre-computed 384-dim question embeddings (question_emb, sentence-transformers/all-MiniLM-L6-v2, cosine-normalized) with a bundled IVF_PQ index for semantic question retrieval.
  • Full-text inverted index on question for keyword search and hybrid retrieval (see the keyword-search sketch after this list).
  • One columnar dataset carrying questions, canonical answers, aliases, types, and embeddings together — project only the columns each query needs.
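
The keyword-search sketch promised above — a minimal example against the inverted index, assuming the connection pattern shown in the loading sections below:

import lancedb

db = lancedb.connect("hf://datasets/lance-format/trivia-qa-lance/data")
tbl = db.open_table("train")

# Full-text search over the `question` column via the bundled INVERTED index.
fts_hits = (
    tbl.search("nobel prize", query_type="fts")
    .select(["question_id", "question", "answer_value"])
    .limit(5)
    .to_list()
)
for r in fts_hits:
    print(r["question_id"], "|", r["question"][:80])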

Splits

Split             Rows
train.lance       138,384
validation.lance   17,944

Schema

Column             Type                           Notes
question_id        string                         TriviaQA question id (e.g. tc_1); natural join key for merges
question           string                         The trivia question
question_source    string                         URL or source the question came from
answer_value       string                         Canonical answer
answer_aliases     list<string>                   Other accepted phrasings (e.g. ["Sinclair Lewis", "Harry Sinclair Lewis"])
normalized_answer  string                         Lowercased / normalized form for exact-match scoring
answer_type        string                         TriviaQA entity type (e.g. WikipediaEntity, FreebaseEntity)
question_emb       fixed_size_list<float32, 384>  MiniLM embedding of question (cosine-normalized)
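
Together, normalized_answer and answer_aliases support alias-aware exact-match scoring. A minimal sketch — the normalize helper below is a hypothetical stand-in for whatever normalization your evaluation pipeline applies:

import string

def normalize(text: str) -> str:
    # Hypothetical stand-in: lowercase, strip punctuation and whitespace.
    return text.lower().translate(str.maketrans("", "", string.punctuation)).strip()

def exact_match(prediction: str, row: dict) -> bool:
    # Accept the canonical normalized answer or any listed alias.
    gold = {row["normalized_answer"], *(normalize(a) for a in row["answer_aliases"])}
    return normalize(prediction) in gold

row = {"normalized_answer": "sinclair lewis",
       "answer_aliases": ["Sinclair Lewis", "Harry Sinclair Lewis"]}
print(exact_match("Harry Sinclair Lewis", row))  # True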

Pre-built indices

  • IVF_PQ on question_emb (metric=cosine) — vector similarity search
  • INVERTED on question — full-text search
  • BTREE on question_id and answer_value — point lookups and prefix scans
  • BITMAP on answer_type — fast filtering by entity type (the scalar indexes are exercised in the sketch after this list)
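
The scalar-index sketch promised above — a BTREE point lookup on question_id and a BITMAP-backed filter on answer_type:

import lancedb

db = lancedb.connect("hf://datasets/lance-format/trivia-qa-lance/data")
tbl = db.open_table("train")

# BTREE point lookup by question id.
row = tbl.search().where("question_id = 'tc_1'").limit(1).to_list()[0]
print(row["question"], "->", row["answer_value"])

# BITMAP-accelerated filter on entity type.
numerical = (
    tbl.search()
    .where("answer_type = 'Numerical'")
    .select(["question_id", "question", "answer_value"])
    .limit(5)
    .to_list()
)
print(len(numerical), "rows with numerical answers sampled")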

Why Lance?

  1. Blazing Fast Random Access: Optimized for fetching scattered rows, making it ideal for random sampling, real-time ML serving, and interactive applications without performance degradation.
  2. Native Multimodal Support: Store text, embeddings, and other data types together in a single file. Large binary objects are loaded lazily, and vectors are optimized for fast similarity search.
  3. Native Index Support: Lance comes with fast, on-disk, scalable vector and FTS indexes that sit right alongside the dataset on the Hub, so you can share not only your data but also your embeddings and indexes without your users needing to recompute them.
  4. Efficient Data Evolution: Add new columns and backfill data without rewriting the entire dataset. This is perfect for evolving ML features, adding new embeddings, or introducing moderation tags over time.
  5. Versatile Querying: Supports combining vector similarity search, full-text search, and SQL-style filtering in a single query, accelerated by on-disk indexes (see the filtered-search sketch after this list).
  6. Data Versioning: Every mutation commits a new version; previous versions remain intact on disk. Tags pin a snapshot by name, so retrieval systems and training runs can reproduce against an exact slice of history.
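
A minimal sketch of that combined querying, using the connection pattern introduced in the loading sections below — vector similarity plus a SQL predicate in one call, with the filter applied before the ANN scan:

import lancedb

db = lancedb.connect("hf://datasets/lance-format/trivia-qa-lance/data")
tbl = db.open_table("train")

# Borrow a stored embedding as the query vector; a real application would
# encode its own question (see the Search section below).
seed = tbl.search().select(["question_emb"]).limit(1).to_list()[0]

hits = (
    tbl.search(seed["question_emb"], vector_column_name="question_emb")
    .metric("cosine")
    .where("answer_type = 'WikipediaEntity'", prefilter=True)  # SQL-style filter
    .select(["question", "answer_value", "answer_type"])
    .limit(5)
    .to_list()
)
for r in hits:
    print(r["answer_type"], "|", r["answer_value"])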

Load with datasets.load_dataset

You can load Lance datasets via the standard HuggingFace datasets interface, suitable when your pipeline already speaks Dataset / IterableDataset or you want a quick streaming sample.

import datasets

hf_ds = datasets.load_dataset("lance-format/trivia-qa-lance", split="train", streaming=True)
for row in hf_ds.take(3):
    print(row["question"], "->", row["answer_value"])

Load with LanceDB

LanceDB is the embedded retrieval library built on top of the Lance format (docs), and is the interface most users interact with. It wraps the dataset as a queryable table with search and filter builders, and is the entry point used by the Search, Curate, Evolve, Versioning, and Materialize-a-subset sections below.

import lancedb

db = lancedb.connect("hf://datasets/lance-format/trivia-qa-lance/data")
tbl = db.open_table("train")
print(len(tbl))

Load with Lance

pylance is the Python binding for the Lance format and works directly with the format's lower-level APIs. Reach for it when you want to inspect dataset internals — schema, scanner, fragments, the list of pre-built indices.

import lance

ds = lance.dataset("hf://datasets/lance-format/trivia-qa-lance/data/train.lance")
print(ds.count_rows(), ds.schema.names)
print(ds.list_indices())

Tip — for production use, download locally first. Streaming from the Hub works for exploration, but heavy random access and ANN search are far faster against a local copy:

hf download lance-format/trivia-qa-lance --repo-type dataset --local-dir ./trivia-qa-lance

Then point Lance or LanceDB at ./trivia-qa-lance/data.

Search

The bundled IVF_PQ index on question_emb turns semantic retrieval over trivia questions into a single call. In production you would encode an incoming question through the same MiniLM encoder used at ingest and pass the resulting 384-dim vector to tbl.search(...). The example below uses the embedding from row 42 as a runnable stand-in.

import lancedb

db = lancedb.connect("hf://datasets/lance-format/trivia-qa-lance/data")
tbl = db.open_table("train")

seed = (
    tbl.search()
    .select(["question_emb", "question"])
    .limit(1)
    .offset(42)
    .to_list()[0]
)

hits = (
    tbl.search(seed["question_emb"], vector_column_name="question_emb")
    .metric("cosine")
    .select(["question_id", "question", "answer_value", "answer_aliases"])
    .limit(10)
    .to_list()
)
for r in hits:
    print(f"{r['answer_value']:30s} | {r['question'][:80]}")

Because the dataset also ships an INVERTED index on question, the same query can be issued as a hybrid search that combines the dense vector with a keyword query. LanceDB merges and reranks the two result lists in a single call, which is useful when a specific named entity must literally appear in the question but the dense side should still drive ranking.

hybrid_hits = (
    tbl.search(query_type="hybrid")
    .vector(seed["question_emb"])
    .text("sistine chapel")
    .select(["question_id", "question", "answer_value", "answer_aliases"])
    .limit(10)
    .to_list()
)
for r in hybrid_hits:
    print(f"{r['answer_value']:30s} | {r['question'][:80]}")

Tune metric, nprobes, and refine_factor on the vector side to trade recall against latency.
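
To query with a fresh question rather than the stored seed, encode it through the same MiniLM model and widen the search. A minimal sketch — it assumes sentence-transformers is installed and reuses the tbl handle from above; the nprobes and refine_factor values are illustrative starting points, not tuned settings:

from sentence_transformers import SentenceTransformer

# Same encoder used at ingest; normalize so cosine distances line up.
encoder = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
qvec = encoder.encode("who won super bowl xx", normalize_embeddings=True)

hits = (
    tbl.search(qvec, vector_column_name="question_emb")
    .metric("cosine")
    .nprobes(20)        # probe more IVF partitions for higher recall
    .refine_factor(5)   # re-rank 5x the requested rows with exact distances
    .select(["question", "answer_value"])
    .limit(10)
    .to_list()
)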

Curate

TriviaQA's answer_type column — backed by a BITMAP index — makes it cheap to slice the dataset by entity category, and length(question) is a handy predicate for filtering out very short or unusually long items. Stacking predicates inside a single filtered scan keeps the result small and explicit, and the bounded .limit(1000) makes the output easy to inspect or hand off.

import lancedb

db = lancedb.connect("hf://datasets/lance-format/trivia-qa-lance/data")
tbl = db.open_table("train")

candidates = (
    tbl.search()
    .where(
        "answer_type = 'WikipediaEntity' "
        "AND length(question) BETWEEN 60 AND 300",
        prefilter=True,
    )
    .select(["question_id", "question", "answer_value", "answer_aliases"])
    .limit(1000)
    .to_list()
)
print(f"{len(candidates)} candidates; first answer: {candidates[0]['answer_value']}")

The question_emb vectors never leave the Hub because they are not projected, so a 1000-row curation pass moves only the selected text columns. The result is a plain list of dictionaries, ready to inspect, persist as a manifest of question ids, or hand to the Materialize-a-subset section below for export to a writable local copy.
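
Persisting that manifest is a couple of lines — a sketch that writes the selected ids from the candidates list above to a hypothetical manifest.json:

import json

# Keep only the ids; the manifest can later drive a filtered export or a
# join back against the table on question_id.
with open("manifest.json", "w") as f:
    json.dump([r["question_id"] for r in candidates], f, indent=2)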

Evolve

Lance stores each column independently, so a new column can be appended without rewriting the existing data. The lightest form is a SQL expression: derive the new column from columns that already exist, and Lance computes it once and persists it. The example below adds a question_length, a num_aliases count, and a has_aliases flag — any of which can then be used directly in where clauses without recomputing the predicate on every query.

Note: Mutations require a local copy of the dataset, since the Hub mount is read-only. See the Materialize-a-subset section at the end of this card for a streaming pattern that downloads only the rows and columns you need, or use hf download to pull the full corpus.

import lancedb

db = lancedb.connect("./trivia-qa-lance/data")  # local copy required for writes
tbl = db.open_table("train")

tbl.add_columns({
    "question_length": "length(question)",
    "num_aliases": "array_length(answer_aliases)",
    "has_aliases": "array_length(answer_aliases) > 0",
})

If the values you want to attach already live in another table (offline reader-model predictions, alternate embeddings, retrieval scores from a different system), merge them in by joining on question_id:

import pyarrow as pa

scores = pa.table({
    "question_id": pa.array(["tc_1", "tc_2"]),
    "retriever_score": pa.array([0.88, 0.31]),
})
tbl.merge(scores, on="question_id")

The original columns and indices are untouched, so existing code that does not reference the new columns continues to work unchanged. New columns become visible to every reader as soon as the operation commits. For column values that require a Python computation (e.g., running a different embedding model over the questions), Lance provides a batch-UDF API — see the Lance data evolution docs.
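
For illustration, a sketch of that batch-UDF path — it assumes pylance's lance.batch_udf decorator and the add_columns(udf, read_columns=...) signature described in those docs, plus a writable local copy:

import lance
import pyarrow as pa

ds = lance.dataset("./trivia-qa-lance/data/train.lance")

@lance.batch_udf()
def question_words(batch: pa.RecordBatch) -> pa.RecordBatch:
    # Arbitrary Python per batch; here, a simple word count per question.
    counts = [len(q.split()) for q in batch["question"].to_pylist()]
    return pa.RecordBatch.from_arrays(
        [pa.array(counts, type=pa.int32())], names=["question_words"]
    )

# Reads only `question`, writes only the new column; existing data is untouched.
ds.add_columns(question_words, read_columns=["question"])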

Train

Projection lets a training loop read only the columns each step actually needs. LanceDB tables expose this through Permutation.identity(tbl).select_columns([...]), which plugs straight into the standard torch.utils.data.DataLoader so prefetching, shuffling, and batching behave as in any PyTorch pipeline. For a closed-book QA model the natural projection is the question, the canonical answer, and the alias list (the aliases serve as additional supervision targets during loss computation or evaluation); for a retriever or reranker on top of frozen features, project the precomputed embedding instead.

import lancedb
from lancedb.permutation import Permutation
from torch.utils.data import DataLoader

db = lancedb.connect("hf://datasets/lance-format/trivia-qa-lance/data")
tbl = db.open_table("train")

train_ds = Permutation.identity(tbl).select_columns(
    ["question", "answer_value", "answer_aliases"]
)
loader = DataLoader(train_ds, batch_size=64, shuffle=True, num_workers=4)

for batch in loader:
    # batch carries only the projected columns; question_emb stays on disk.
    # tokenize question and answer, forward, backward...
    ...

Switching feature sets is a configuration change: passing ["question_emb", "answer_value"] to select_columns(...) on the next run reads only the 384-d vectors and the canonical answer string, which is the right shape for training a retrieval head or reranker on cached embeddings. Columns added in Evolve cost nothing per batch until they are explicitly projected.
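
Concretely, the retriever-style variant is the same pipeline with a different projection — a short sketch reusing the Permutation and DataLoader imports from above:

# Cached MiniLM vectors plus the target string; question text stays on disk.
retrieval_ds = Permutation.identity(tbl).select_columns(
    ["question_emb", "answer_value"]
)
retrieval_loader = DataLoader(retrieval_ds, batch_size=256, shuffle=True)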

Versioning

Every mutation to a Lance dataset, whether it adds a column, merges labels, or builds an index, commits a new version. Previous versions remain intact on disk. You can list versions and inspect the history directly from the Hub copy; creating new tags requires a local copy since tags are writes.

import lancedb

db = lancedb.connect("hf://datasets/lance-format/trivia-qa-lance/data")
tbl = db.open_table("train")

print("Current version:", tbl.version)
print("History:", tbl.list_versions())
print("Tags:", tbl.tags.list())

Once you have a local copy, tag a version for reproducibility:

local_db = lancedb.connect("./trivia-qa-lance/data")
local_tbl = local_db.open_table("train")
local_tbl.tags.create("baseline-v1", local_tbl.version)

A tagged version can be opened by name, or any version reopened by its number, against either the Hub copy or a local one:

tbl_v1 = db.open_table("train", version="baseline-v1")
tbl_v5 = db.open_table("train", version=5)

Pinning supports two workflows. A retrieval system locked to baseline-v1 keeps returning stable results while the dataset evolves in parallel — newly added scores or alternate embeddings do not change what the tag resolves to. A training experiment pinned to the same tag can be rerun later against the exact same questions and answers, so changes in metrics reflect model changes rather than data drift. Neither workflow needs shadow copies or external manifest tracking.

Materialize a subset

Reads from the Hub are lazy, so exploratory queries only transfer the columns and row groups they touch. Mutating operations (Evolve, tag creation) need a writable backing store, and a training loop benefits from a local copy with fast random access. Both can be served by a subset of the dataset rather than the full corpus. The pattern is to stream a filtered query through .to_batches() into a new local table; only the projected columns and matching row groups cross the wire, and the bytes never fully materialize in Python memory.

import lancedb

remote_db = lancedb.connect("hf://datasets/lance-format/trivia-qa-lance/data")
remote_tbl = remote_db.open_table("train")

batches = (
    remote_tbl.search()
    .where("answer_type = 'WikipediaEntity' AND length(question) >= 60")
    .select(
        ["question_id", "question", "answer_value", "answer_aliases",
         "normalized_answer", "answer_type", "question_emb"]
    )
    .to_batches()
)

local_db = lancedb.connect("./trivia-qa-wiki")
local_db.create_table("train", batches)

The resulting ./trivia-qa-wiki is a first-class LanceDB database. Every snippet in the Search, Evolve, Train, and Versioning sections above works against it by swapping hf://datasets/lance-format/trivia-qa-lance/data for ./trivia-qa-wiki.

Source & license

Converted from mandarjoshi/trivia_qa (rc.nocontext config). TriviaQA is released under the Apache 2.0 license.

Citation

@article{joshi2017triviaqa,
  title={TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension},
  author={Joshi, Mandar and Choi, Eunsol and Weld, Daniel S and Zettlemoyer, Luke},
  journal={arXiv preprint arXiv:1705.03551},
  year={2017}
}