mask-distilled-onesec-cv12-each-chunk-uniq/chunk_190 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1355312180.0
num_examples: 266165
download_size: 1383051494
dataset_size: 1355312180.0
---
# Dataset Card for "chunk_190"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Paul0fernando/paulofernando | ---
license: openrail
---
|
danavery/urbansound8K | ---
language:
- en
license: cc-by-nc-4.0
size_categories:
- 1K<n<10K
task_categories:
- audio-classification
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: slice_file_name
dtype: string
- name: fsID
dtype: int64
- name: start
dtype: float64
- name: end
dtype: float64
- name: salience
dtype: int64
- name: fold
dtype: int64
- name: classID
dtype: int64
- name: class
dtype: string
splits:
- name: train
num_bytes: 7605141208.66
num_examples: 8732
download_size: 6998085428
dataset_size: 7605141208.66
---
(card and dataset copied from https://www.kaggle.com/datasets/chrisfilo/urbansound8k)
This dataset contains 8732 labeled sound excerpts (<=4s) of urban sounds from 10 classes: `air_conditioner`, `car_horn`, `children_playing`, `dog_bark`, `drilling`, `engine_idling`, `gun_shot`, `jackhammer`, `siren`, and `street_music`. The classes are drawn from the urban sound taxonomy. For a detailed description of the dataset and how it was compiled, please refer to our paper. All excerpts are taken from field recordings uploaded to www.freesound.org. The files are pre-sorted into ten folds (folders named fold1-fold10) to help in the reproduction of, and comparison with, the automatic classification results reported in the article above.
In addition to the sound excerpts, a CSV file containing metadata about each excerpt is also provided.
## AUDIO FILES INCLUDED
8732 audio files of urban sounds (see description above) in WAV format. The sampling rate, bit depth, and number of channels are the same as those of the original file uploaded to Freesound (and hence may vary from file to file).
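Because the sampling rate, bit depth, and channel count vary from file to file, it can help to inspect each WAV header before batch processing. A minimal sketch using Python's standard `wave` module (the file path shown is hypothetical, following the naming convention described below):

```python
import wave

def wav_info(path):
    """Read basic header fields from a WAV file."""
    with wave.open(path, "rb") as w:
        return {
            "sample_rate": w.getframerate(),        # Hz; varies per file
            "channels": w.getnchannels(),           # mono or stereo
            "sample_width_bytes": w.getsampwidth()  # bit depth / 8
        }

# e.g. wav_info("fold1/101415-3-0-2.wav")
```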
## META-DATA FILES INCLUDED
```
UrbanSound8k.csv
```
This file contains meta-data information about every audio file in the dataset. This includes:
* slice_file_name:
The name of the audio file. The name takes the following format: [fsID]-[classID]-[occurrenceID]-[sliceID].wav, where:
[fsID] = the Freesound ID of the recording from which this excerpt (slice) is taken
[classID] = a numeric identifier of the sound class (see description of classID below for further details)
[occurrenceID] = a numeric identifier to distinguish different occurrences of the sound within the original recording
[sliceID] = a numeric identifier to distinguish different slices taken from the same occurrence
* fsID:
The Freesound ID of the recording from which this excerpt (slice) is taken
* start:
The start time of the slice in the original Freesound recording
* end:
The end time of the slice in the original Freesound recording
* salience:
A (subjective) salience rating of the sound. 1 = foreground, 2 = background.
* fold:
The fold number (1-10) to which this file has been allocated.
* classID:
A numeric identifier of the sound class:
0 = air_conditioner
1 = car_horn
2 = children_playing
3 = dog_bark
4 = drilling
5 = engine_idling
6 = gun_shot
7 = jackhammer
8 = siren
9 = street_music
* class:
The class name: air_conditioner, car_horn, children_playing, dog_bark, drilling, engine_idling, gun_shot, jackhammer,
siren, street_music.
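The filename convention above can be unpacked programmatically. A small illustrative helper (not part of the dataset's own tooling):

```python
def parse_slice_file_name(name):
    """Split [fsID]-[classID]-[occurrenceID]-[sliceID].wav into its fields."""
    stem = name.rsplit(".", 1)[0]
    fs_id, class_id, occurrence_id, slice_id = (int(p) for p in stem.split("-"))
    return {"fsID": fs_id, "classID": class_id,
            "occurrenceID": occurrence_id, "sliceID": slice_id}

# "100032-3-0-0.wav" -> fsID 100032, classID 3 (dog_bark), occurrence 0, slice 0
```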
## BEFORE YOU DOWNLOAD: AVOID COMMON PITFALLS!
Since releasing the dataset we have noticed a couple of common mistakes that could invalidate your results, potentially leading to manuscripts being rejected or the publication of incorrect results. To avoid this, please read the following carefully:
1. Don't reshuffle the data! Use the predefined 10 folds and perform 10-fold (not 5-fold) cross validation
The experiments conducted by the vast majority of publications using UrbanSound8K (by ourselves and others) evaluate classification models via 10-fold cross validation using the predefined splits*. We strongly recommend following this procedure.
Why?
If you reshuffle the data (e.g. combine the data from all folds and generate a random train/test split) you will be incorrectly placing related samples in both the train and test sets, leading to inflated scores that don't represent your model's performance on unseen data. Put simply, your results will be wrong.
Your results will NOT be comparable to previous results in the literature, meaning any claims to an improvement on previous research will be invalid. Even if you don't reshuffle the data, evaluating using different splits (e.g. 5-fold cross validation) will mean your results are not comparable to previous research.
2. Don't evaluate just on one split! Use 10-fold (not 5-fold) cross validation and average the scores
We have seen reports that only provide results for a single train/test split, e.g. train on folds 1-9, test on fold 10 and report a single accuracy score. We strongly advise against this. Instead, perform 10-fold cross validation using the provided folds and report the average score.
Why?
Not all splits are equally "easy". That is, models tend to obtain much higher scores when trained on folds 1-9 and tested on fold 10 than when, for example, trained on folds 2-10 and tested on fold 1. For this reason, it is important to evaluate your model on each of the 10 splits and report the average accuracy.
Again, your results will NOT be comparable to previous results in the literature.
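The recommended protocol can be sketched as a loop over the predefined folds. Here `train_and_eval` is a hypothetical callback standing in for your own training code; it receives the train/test metadata records for one split and returns an accuracy:

```python
import statistics

def ten_fold_cv(rows, train_and_eval):
    """rows: metadata records, each with a 'fold' field (1-10).
    train_and_eval(train_rows, test_rows) -> accuracy for one split."""
    scores = []
    for test_fold in range(1, 11):
        train = [r for r in rows if r["fold"] != test_fold]
        test = [r for r in rows if r["fold"] == test_fold]
        scores.append(train_and_eval(train, test))
    # Report the average over all ten splits, never a single split.
    return statistics.mean(scores), scores
```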
## Acknowledgements
We kindly request that articles and other works in which this dataset is used cite the following paper:
J. Salamon, C. Jacoby and J. P. Bello, "A Dataset and Taxonomy for Urban Sound Research", 22nd ACM International Conference on Multimedia, Orlando, USA, Nov. 2014.
More information at https://urbansounddataset.weebly.com/urbansound8k.html |
logasja/lfw | ---
dataset_info:
- config_name: aug
features:
- name: orig
dtype: image
- name: aug
dtype: image
splits:
- name: train
num_bytes: 502317747.058
num_examples: 13233
download_size: 473483260
dataset_size: 502317747.058
- config_name: default
features:
- name: label
dtype:
class_label:
names:
'0': AJ_Cook
'1': AJ_Lamas
'2': Aaron_Eckhart
'3': Aaron_Guiel
'4': Aaron_Patterson
'5': Aaron_Peirsol
'6': Aaron_Pena
'7': Aaron_Sorkin
'8': Aaron_Tippin
'9': Abba_Eban
'10': Abbas_Kiarostami
'11': Abdel_Aziz_Al-Hakim
'12': Abdel_Madi_Shabneh
'13': Abdel_Nasser_Assidi
'14': Abdoulaye_Wade
'15': Abdul_Majeed_Shobokshi
'16': Abdul_Rahman
'17': Abdulaziz_Kamilov
'18': Abdullah
'19': Abdullah_Ahmad_Badawi
'20': Abdullah_Gul
'21': Abdullah_Nasseef
'22': Abdullah_al-Attiyah
'23': Abdullatif_Sener
'24': Abel_Aguilar
'25': Abel_Pacheco
'26': Abid_Hamid_Mahmud_Al-Tikriti
'27': Abner_Martinez
'28': Abraham_Foxman
'29': Aby_Har-Even
'30': Adam_Ant
'31': Adam_Freier
'32': Adam_Herbert
'33': Adam_Kennedy
'34': Adam_Mair
'35': Adam_Rich
'36': Adam_Sandler
'37': Adam_Scott
'38': Adel_Al-Jubeir
'39': Adelina_Avila
'40': Adisai_Bodharamik
'41': Adolfo_Aguilar_Zinser
'42': Adolfo_Rodriguez_Saa
'43': Adoor_Gopalakarishnan
'44': Adrian_Annus
'45': Adrian_Fernandez
'46': Adrian_McPherson
'47': Adrian_Murrell
'48': Adrian_Nastase
'49': Adriana_Lima
'50': Adriana_Perez_Navarro
'51': Adrianna_Zuzic
'52': Adrien_Brody
'53': Afton_Smith
'54': Agbani_Darego
'55': Agnelo_Queiroz
'56': Agnes_Bruckner
'57': Ahmad_Jbarah
'58': Ahmad_Masood
'59': Ahmed_Ahmed
'60': Ahmed_Chalabi
'61': Ahmed_Ghazi
'62': Ahmed_Ibrahim_Bilal
'63': Ahmed_Lopez
'64': Ahmed_Qureia
'65': Ahmet_Demir
'66': Ahmet_Necdet_Sezer
'67': Ai_Sugiyama
'68': Aicha_El_Ouafi
'69': Aidan_Quinn
'70': Aileen_Riggin_Soule
'71': Ain_Seppik
'72': Ainsworth_Dyer
'73': Aishwarya_Rai
'74': Aitor_Gonzalez
'75': Aiysha_Smith
'76': Ajit_Agarkar
'77': Akbar_Al_Baker
'78': Akbar_Hashemi_Rafsanjani
'79': Akhmed_Zakayev
'80': Akiko_Morigami
'81': Akmal_Taher
'82': Al_Cardenas
'83': Al_Davis
'84': Al_Gore
'85': Al_Leiter
'86': Al_Pacino
'87': Al_Sharpton
'88': Alain_Cervantes
'89': Alain_Ducasse
'90': Alan_Ball
'91': Alan_Dershowitz
'92': Alan_Dreher
'93': Alan_Greenspan
'94': Alan_Greer
'95': Alan_Jackson
'96': Alan_Mulally
'97': Alan_Stonecipher
'98': Alan_Tang_Kwong-wing
'99': Alan_Trammell
'100': Alan_Zemaitis
'101': Alanis_Morissette
'102': Alanna_Ubach
'103': Alastair_Campbell
'104': Alastair_Johnston
'105': Albaro_Recoba
'106': Albert_Brooks
'107': Albert_Costa
'108': Albert_Montanes
'109': Albert_Pujols
'110': Alberta_Lee
'111': Alberto_Acosta
'112': Alberto_Fujimori
'113': Alberto_Gonzales
'114': Alberto_Ruiz_Gallardon
'115': Alberto_Sordi
'116': Albrecht_Mentz
'117': Aldo_Paredes
'118': Alec_Baldwin
'119': Alecos_Markides
'120': Alejandro_Atchugarry
'121': Alejandro_Avila
'122': Alejandro_Fernandez
'123': Alejandro_Gonzalez_Inarritu
'124': Alejandro_Lembo
'125': Alejandro_Lerner
'126': Alejandro_Lopez
'127': Alejandro_Toledo
'128': Alek_Wek
'129': Aleksander_Kwasniewski
'130': Aleksander_Voloshin
'131': Alessandra_Cerna
'132': Alessandro_Nesta
'133': Alex_Barros
'134': Alex_Cabrera
'135': Alex_Cejka
'136': Alex_Corretja
'137': Alex_Ferguson
'138': Alex_Gonzalez
'139': Alex_Holmes
'140': Alex_King
'141': Alex_Penelas
'142': Alex_Popov
'143': Alex_Sink
'144': Alex_Wallau
'145': Alex_Zanardi
'146': Alexa_Loren
'147': Alexa_Vega
'148': Alexander_Downer
'149': Alexander_Losyukov
'150': Alexander_Lukashenko
'151': Alexander_Payne
'152': Alexander_Rumyantsev
'153': Alexandra_Jackson
'154': Alexandra_Pelosi
'155': Alexandra_Rozovskaya
'156': Alexandra_Spann
'157': Alexandra_Stevenson
'158': Alexandra_Vodjanikova
'159': Alexandre_Daigle
'160': Alexandre_Despatie
'161': Alexandre_Herchcovitch
'162': Alexandre_Vinokourov
'163': Alexis_Bledel
'164': Alexis_Dennisoff
'165': Alfonso_Cuaron
'166': Alfonso_Portillo
'167': Alfonso_Soriano
'168': Alfred_Ford
'169': Alfred_Sant
'170': Alfredo_Moreno
'171': Alfredo_Pena
'172': Alfredo_di_Stefano
'173': Ali_Abbas
'174': Ali_Abdullah_Saleh
'175': Ali_Adbul_Karim_Madani
'176': Ali_Ahmeti
'177': Ali_Bin_Hussein
'178': Ali_Fallahian
'179': Ali_Hammoud
'180': Ali_Khamenei
'181': Ali_Mohammed_Maher
'182': Ali_Naimi
'183': Alice_Fisher
'184': Alicia_Hollowell
'185': Alicia_Keys
'186': Alicia_Molik
'187': Alicia_Silverstone
'188': Alicia_Witt
'189': Alimzhan_Tokhtakhounov
'190': Alina_Kabaeva
'191': Aline_Chretien
'192': Alisha_Richman
'193': Alison_Krauss
'194': Alison_Lohman
'195': Alistair_MacDonald
'196': Allan_Houston
'197': Allan_Kemakeza
'198': Allan_Wagner
'199': Allen_Iverson
'200': Allen_Rock
'201': Allison_Janney
'202': Allison_Searing
'203': Ally_Sheedy
'204': Allyson_Felix
'205': Alma_Powell
'206': Almeida_Baptista
'207': Alonzo_Mourning
'208': Alvaro_Noboa
'209': Alvaro_Silva_Calderon
'210': Alvaro_Uribe
'211': Aly_Wagner
'212': Alyse_Beaupre
'213': Alyson_Hannigan
'214': Amanda_Beard
'215': Amanda_Bynes
'216': Amanda_Coetzer
'217': Amanda_Marsh
'218': Amanda_Plumer
'219': Amber_Frey
'220': Amber_Tamblyn
'221': Ambrose_Lee
'222': Amelia_Vega
'223': Amelie_Mauresmo
'224': Amer_al-Saadi
'225': Amporn_Falise
'226': Amr_Moussa
'227': Amram_Mitzna
'228': Amy_Brenneman
'229': Amy_Cotton
'230': Amy_Gale
'231': Amy_Pascal
'232': Amy_Redford
'233': Amy_Smart
'234': Amy_Yasbeck
'235': AnFernce_Negron
'236': Ana_Claudia_Talancon
'237': Ana_Guevara
'238': Ana_Isabel_Sanchez
'239': Ana_Palacio
'240': Ana_Paula_Gerard
'241': Ana_Sebastiao
'242': Anastasia_Kelesidou
'243': Anastasia_Myskina
'244': Anatoliy_Kinakh
'245': Anders_Ebbeson
'246': Anders_Fogh_Rasmussen
'247': Anderson_Varejao
'248': Andre_Agassi
'249': Andre_Bucher
'250': Andre_Lange
'251': Andre_Smith
'252': Andre_Techine
'253': Andrea_Bocelli
'254': Andrea_De_Cruz
'255': Andrea_Kiser
'256': Andrea_Yates
'257': Andreas_Vinciguerra
'258': Andrei_Konchalovsky
'259': Andrei_Mikhnevich
'260': Andrei_Nikolishin
'261': Andres_DAlessandro
'262': Andres_Manuel_Lopez_Obrador
'263': Andres_Pastrana
'264': Andrew_Bernard
'265': Andrew_Bunner
'266': Andrew_Caldecott
'267': Andrew_Cuomo
'268': Andrew_Fastow
'269': Andrew_Firestone
'270': Andrew_Gilligan
'271': Andrew_Jarecki
'272': Andrew_Luster
'273': Andrew_Niccol
'274': Andrew_Sabey
'275': Andrew_Shutley
'276': Andrew_Weissmann
'277': Andrew_Wetzler
'278': Andrzej_Tyszkiewicz
'279': Andy_Benes
'280': Andy_Bryant
'281': Andy_Dick
'282': Andy_Garcia
'283': Andy_Graves
'284': Andy_Griffith
'285': Andy_Griggs
'286': Andy_Hebb
'287': Andy_Lau
'288': Andy_Madikians
'289': Andy_North
'290': Andy_Perez
'291': Andy_Roddick
'292': Andy_Rooney
'293': Andy_Warhol
'294': Andy_Wisecarver
'295': Anette_Hosoi
'296': Angel_Lockward
'297': Angel_Maza
'298': Angela_Alvarado_Rosa
'299': Angela_Bassett
'300': Angela_Lansbury
'301': Angela_Mascia-Frye
'302': Angela_Merkel
'303': Angelica_Romero
'304': Angelina_Jolie
'305': Angelo_Genova
'306': Angelo_Reyes
'307': Angie_Arzola
'308': Angie_Martinez
'309': Anibal_Ibarra
'310': Anil_Ramsook
'311': Anita_DeFrantz
'312': Anja_Paerson
'313': Anjum_Hussain
'314': Ann_Godbehere
'315': Ann_Landers
'316': Ann_Morgan
'317': Ann_Veneman
'318': Anna_Chicherova
'319': Anna_Faris
'320': Anna_Jones
'321': Anna_Kournikova
'322': Anna_Nicole_Smith
'323': Anne_Cavers
'324': Anne_Donovan
'325': Anne_Heche
'326': Anne_Krueger
'327': Anne_McLellan
'328': Anne_ONeil
'329': Anneli_Jaatteenmaki
'330': Annette_Bening
'331': Annette_Lu
'332': Annie-Jeanne_Reynaud
'333': Annie_Chaplin
'334': Annie_Machon
'335': Annika_Sorenstam
'336': Antanas_Valionis
'337': Anthony_Carter
'338': Anthony_Corso
'339': Anthony_Ervin
'340': Anthony_Fauci
'341': Anthony_Garotinho
'342': Anthony_Hazen
'343': Anthony_Hopkins
'344': Anthony_LaPaglia
'345': Anthony_Lee_Johnson
'346': Anthony_Mazur
'347': Anthony_Pico
'348': Anthony_Pisciotti
'349': Anthony_Principi
'350': Anthony_Rackauckas
'351': Anthony_Scott_Miller
'352': Antje_Buschschulte
'353': Anton_Balasingham
'354': Antonio_Banderas
'355': Antonio_Bernardo
'356': Antonio_Cassano
'357': Antonio_Catania
'358': Antonio_Elias_Saca
'359': Antonio_Palocci
'360': Antonio_Trillanes
'361': Antony_Leung
'362': Antwun_Echols
'363': Anwar_Ibrahim
'364': Anzori_Kikalishvili
'365': Aparna_Pillai
'366': Aram_Adler
'367': Arantxa_Sanchez-Vicario
'368': Aretha_Franklin
'369': Ari_Bousbib
'370': Ari_Fleischer
'371': Arianna_Huffington
'372': Arie_Haan
'373': Ariel_Sharon
'374': Arif_Mardin
'375': Arlen_Specter
'376': Armand_Sargen
'377': Armando_Avila_Panchame
'378': Armando_Calderon_Sol
'379': Armando_Carrillo
'380': Arminio_Fraga
'381': Arnaud_Clement
'382': Arnaud_Lagardere
'383': Arnie_Boehm
'384': Arnold_Palmer
'385': Arnold_Schwarzenegger
'386': Arnold_Scott
'387': Arnoldo_Aleman
'388': Aron_Ralston
'389': Arsinee_Khanjian
'390': Art_Cooper
'391': Art_Hoffmann
'392': Art_Howe
'393': Art_Lopez
'394': Arthur_Johnson
'395': Arthur_Martinez
'396': Artieas_Shanks
'397': Arturo_Gatti
'398': Arye_Mekel
'399': Asa_Hutchinson
'400': Ascencion_Barajas
'401': Ashanti
'402': Ashlea_Talbot
'403': Ashley_Judd
'404': Ashley_Olsen
'405': Ashley_Postell
'406': Ashraf_Alasmar
'407': Ashraf_Ghani
'408': Ashton_Kutcher
'409': Asif_Ali_Zardari
'410': Asif_Hanif
'411': Askar_Akayev
'412': Asmaa_Assad
'413': Assad_Ahmadi
'414': Astou_Ndiaye-Diatta
'415': Astrid_Betancourt
'416': Astrid_Eyzaguirre
'417': Atal_Bihari_Vajpayee
'418': Ataollah_Mohajerani
'419': Atiabet_Ijan_Amabel
'420': Atom_Egoyan
'421': Atsushi_Sato
'422': Audrey_Lacroix
'423': Audrey_Sauret
'424': Augustin_Calleri
'425': Augusto_Pinochet
'426': Augusto_Roa_Bastos
'427': Aung_San_Suu_Kyi
'428': Austin_Kearns
'429': Avril_Lavigne
'430': Azmi_Bishara
'431': Azra_Akin
'432': BB_King
'433': BJ_Habibie
'434': Babe_Ruth
'435': Baburam_Bhattari
'436': Bak_Chang-Ryun
'437': Barbara_Bach
'438': Barbara_Becker
'439': Barbara_Bodine
'440': Barbara_Boxer
'441': Barbara_Brezigar
'442': Barbara_De_Brun
'443': Barbara_Esbin
'444': Barbara_Felt-Miller
'445': Barbara_Roberts
'446': Barbara_Walters
'447': Barbora_Strycova
'448': Barbra_Streisand
'449': Barrett_Jackman
'450': Barry_Alvarez
'451': Barry_Bonds
'452': Barry_Collier
'453': Barry_Diller
'454': Barry_Ford
'455': Barry_Hinson
'456': Barry_Nakell
'457': Barry_Switzer
'458': Barry_Williams
'459': Barry_Zito
'460': Bart_Freundlich
'461': Bart_Hendricks
'462': Bartosz_Kizierowski
'463': Barzan_al-Tikriti
'464': Basdeo_Panday
'465': Bashar_Assad
'466': Baz_Luhrmann
'467': Beatrice_Dalle
'468': Beatriz_Merino
'469': Beecher_Ray_Kirby
'470': Begum_Khaleda_Zia
'471': Bela_Karolyi
'472': Ben_Affleck
'473': Ben_Betts
'474': Ben_Braun
'475': Ben_Broussard
'476': Ben_Cahoon
'477': Ben_Chandler
'478': Ben_Cohen
'479': Ben_Curtis
'480': Ben_Davis
'481': Ben_Glisan
'482': Ben_Howland
'483': Ben_Kingsley
'484': Ben_Lee
'485': Ben_Stein
'486': Ben_Wallace
'487': Benazir_Bhutto
'488': Benedita_da_Silva
'489': Benicio_Del_Toro
'490': Benito_Santiago
'491': Benjamin_Bratt
'492': Benjamin_Franklin
'493': Benjamin_Martinez
'494': Benjamin_McKenzie
'495': Benjamin_Netanyahu
'496': Benjamin_Neulander
'497': Bernadette_Peters
'498': Bernard_Ebbers
'499': Bernard_Giraudeau
'500': Bernard_Kerik
'501': Bernard_Landry
'502': Bernard_Law
'503': Bernard_Lord
'504': Bernard_Siegel
'505': Bernardo_Segura
'506': Bernice_Wong
'507': Bertie_Ahern
'508': Bertrand_Bonello
'509': Bertrand_Delanoe
'510': Beth_Blough
'511': Beth_Jones
'512': Betsy_Coffin
'513': Betsy_Smith
'514': Bettina_Rheims
'515': Betty_Garrison
'516': Betty_Williams
'517': Beyonce_Knowles
'518': Bianca_Jagger
'519': Bijan_Darvish
'520': Bijan_Namdar_Zangeneh
'521': Bilal_Erdogan
'522': Biljana_Plavsic
'523': Bill_Belichick
'524': Bill_Bradley
'525': Bill_Butler
'526': Bill_Byrne
'527': Bill_Callahan
'528': Bill_Carmody
'529': Bill_Cartwright
'530': Bill_Clancy
'531': Bill_Clinton
'532': Bill_Curry
'533': Bill_Doba
'534': Bill_Duffey
'535': Bill_Elliott
'536': Bill_Fennelly
'537': Bill_Frist
'538': Bill_Gates
'539': Bill_Graham
'540': Bill_Guerin
'541': Bill_Herrion
'542': Bill_Hughes
'543': Bill_King
'544': Bill_Kollar
'545': Bill_Kong
'546': Bill_Lerach
'547': Bill_Maher
'548': Bill_Mauldin
'549': Bill_McBride
'550': Bill_Nelson
'551': Bill_OReilly
'552': Bill_Parcells
'553': Bill_Parsons
'554': Bill_Paxton
'555': Bill_Pryor
'556': Bill_Rainer
'557': Bill_Readdy
'558': Bill_Richardson
'559': Bill_Self
'560': Bill_Simon
'561': Bill_Sizemore
'562': Bill_Stapleton
'563': Bill_Stein
'564': Bill_Walton
'565': Billy_Andrade
'566': Billy_Beane
'567': Billy_Bob_Thornton
'568': Billy_Boyd
'569': Billy_Crawford
'570': Billy_Crystal
'571': Billy_Donovan
'572': Billy_Edelin
'573': Billy_Gilman
'574': Billy_Graham
'575': Billy_Joel
'576': Billy_Rork
'577': Billy_Sollie
'578': Billy_Tibbets
'579': Bing_Crosby
'580': Binyamin_Ben-Eliezer
'581': Bison_Dele
'582': Bixente_LIzarazu
'583': Blas_Ople
'584': Blythe_Danner
'585': Blythe_Hartley
'586': Bo_Pelini
'587': Bo_Ryan
'588': Bob_Alper
'589': Bob_Beauprez
'590': Bob_Bowlsby
'591': Bob_Cantrell
'592': Bob_Colvin
'593': Bob_Crippen
'594': Bob_Curtis
'595': Bob_Dole
'596': Bob_Eskridge
'597': Bob_Ferguson
'598': Bob_Geldof
'599': Bob_Goldman
'600': Bob_Graham
'601': Bob_Guccione
'602': Bob_Hartley
'603': Bob_Hayes
'604': Bob_Herz
'605': Bob_Holden
'606': Bob_Hope
'607': Bob_Huggins
'608': Bob_Iger
'609': Bob_Krueger
'610': Bob_Melvin
'611': Bob_Menendez
'612': Bob_Newhart
'613': Bob_Petrino
'614': Bob_Riley
'615': Bob_Stoops
'616': Bob_Sulkin
'617': Bob_Taft
'618': Bob_Wright
'619': Bobby_Bowden
'620': Bobby_Goldwater
'621': Bobby_Jackson
'622': Bobby_Kielty
'623': Bobby_Robson
'624': Bobo_Balde
'625': Bode_Miller
'626': Bonnie_Fuller
'627': Bonnie_Hunt
'628': Bono
'629': Boris_Becker
'630': Boris_Berezovsky
'631': Boris_Henry
'632': Boris_Jordan
'633': Boris_Trajkovski
'634': Boris_Yeltsin
'635': Boutros_Boutros_Ghali
'636': Brad_Alexander_Smith
'637': Brad_Banks
'638': Brad_Brownell
'639': Brad_Garrett
'640': Brad_Gushue
'641': Brad_Johnson
'642': Brad_Miller
'643': Brad_Pitt
'644': Brad_Russ
'645': Brad_Smith
'646': Brad_Wilk
'647': Brady_Rodgers
'648': Brajesh_Mishra
'649': Brandon_Boyd
'650': Brandon_Fails
'651': Brandon_Hammond
'652': Brandon_Inge
'653': Brandon_Jones
'654': Brandon_Knight
'655': Brandon_Larson
'656': Brandon_Lloyd
'657': Brandon_Robinson
'658': Brandon_Spann
'659': Brandon_Webb
'660': Branko_Crvenkovski
'661': Brawley_King
'662': Brenda_Magana
'663': Brenda_Wilson
'664': Brenda_van_Dam
'665': Brendan_Fraser
'666': Brendan_Gaughan
'667': Brendan_Hansen
'668': Brendan_Stai
'669': Brennon_Leighton
'670': Brent_Coles
'671': Brett_Boone
'672': Brett_Hawke
'673': Brett_Hull
'674': Brett_Perry
'675': Brian_Billick
'676': Brian_Campbell
'677': Brian_Cashman
'678': Brian_Clemens
'679': Brian_Cook
'680': Brian_Cowen
'681': Brian_De_Palma
'682': Brian_Florence
'683': Brian_Grazier
'684': Brian_Gregory
'685': Brian_Griese
'686': Brian_Heidik
'687': Brian_Henson
'688': Brian_Jordan
'689': Brian_Kerr
'690': Brian_Lara
'691': Brian_McIntyre
'692': Brian_Meadors
'693': Brian_Mulroney
'694': Brian_Olson
'695': Brian_Pavlich
'696': Brian_Scalabrine
'697': Brian_Schneider
'698': Brian_StPierre
'699': Brian_Van_Dusen
'700': Brian_Weaver
'701': Brian_Wells
'702': Brian_Williams
'703': Bridget_Fonda
'704': Bridgette_Wilson-Sampras
'705': Brigitte_Boisselier
'706': Britney_Spears
'707': Brittany_Snow
'708': Brock_Berlin
'709': Bronson_Arroyo
'710': Brook_Robinson
'711': Brooke_Adams
'712': Brooke_Gordon
'713': Brooke_Shields
'714': Bruce_Arena
'715': Bruce_Gebhardt
'716': Bruce_Lunsford
'717': Bruce_Paltrow
'718': Bruce_Springsteen
'719': Bruce_Van_De_Velde
'720': Bruce_Weber
'721': Bruce_Willis
'722': Bruna_Colosio
'723': Bruno_Junquiera
'724': Bryan_Adams
'725': Bryan_Chui
'726': Bryan_Cooley
'727': Bryan_Murray
'728': Bryan_Thomas
'729': Bryant_Young
'730': Bryce_Carmine
'731': Buck_Rodgers
'732': Bud_Selig
'733': Budd_Schulberg
'734': Buddy_Ryan
'735': Buford_Blount
'736': Bulent_Ecevit
'737': Bustam_A_Zedan_Aljanabi
'738': Butch_Davis
'739': Buzz_Hargrove
'740': Byron_Scott
'741': Cabas
'742': Caio_Blat
'743': Calbert_Cheaney
'744': Calista_Flockhart
'745': Calvin_Harrison
'746': Calvin_Joseph_Coleman
'747': Cameron_Diaz
'748': Camilla_Parker_Bowles
'749': Camille_Colvin
'750': Camille_Lewis
'751': Camryn_Manheim
'752': Candace_Sutton
'753': Candice_Beatty
'754': Candice_Bergen
'755': Candie_Kung
'756': Carey_Lowell
'757': Cari_Davis
'758': Carin_Koch
'759': Carina_Lau_Ka-ling
'760': Carl_Levin
'761': Carl_Pope
'762': Carl_Reiner
'763': Carla_Del_Ponte
'764': Carla_Gay_Balingit
'765': Carla_Gugino
'766': Carla_Moreno
'767': Carla_Myers
'768': Carla_Sullivan
'769': Carla_Tricoli
'770': Carlo_Ancelotti
'771': Carlo_Azeglio_Ciampi
'772': Carlos_Alberto
'773': Carlos_Alberto_Parreira
'774': Carlos_Arroyo
'775': Carlos_Barra
'776': Carlos_Barragan
'777': Carlos_Beltran
'778': Carlos_Bianchi
'779': Carlos_De_Abreu
'780': Carlos_Fasciolo
'781': Carlos_Ghosn
'782': Carlos_Iturgaitz
'783': Carlos_Juarez
'784': Carlos_Lordkipanitse
'785': Carlos_Manuel_Pruneda
'786': Carlos_Menem
'787': Carlos_Mesa
'788': Carlos_Moya
'789': Carlos_Ortega
'790': Carlos_Paternina
'791': Carlos_Queiroz
'792': Carlos_Quintanilla_Schmidt
'793': Carlos_Ruckauf
'794': Carlos_Ruiz
'795': Carlos_Salinas
'796': Carlos_Savedra
'797': Carlos_Vives
'798': Carlton_Baugh
'799': Carlton_Dotson
'800': Carly_Fiorina
'801': Carly_Gullickson
'802': Carmen_Electra
'803': Carol_Burnett
'804': Carol_Carmody
'805': Carol_Moseley_Braun
'806': Carol_Niedermayer
'807': Carol_Williams
'808': Carolina_Barco
'809': Carolina_Kluft
'810': Carolina_Moraes
'811': Caroline_Dhavernas
'812': Caroline_Kennedy
'813': Caroline_Link
'814': Carolyn_Dawn_Johnson
'815': Carolyn_Kuhl
'816': Carrie-Anne_Moss
'817': Carroll_Weimer
'818': Carson_Daly
'819': Carson_Palmer
'820': Casey_Crowder
'821': Casey_Mears
'822': Cass_Ballenger
'823': Cassandra_Heise
'824': Casy_Preslar
'825': Cate_Blanchett
'826': Catherine_Bell
'827': Catherine_Deneuve
'828': Catherine_Donkers
'829': Catherine_Ndereba
'830': Catherine_Woodard
'831': Catherine_Zeta-Jones
'832': Cathryn_Crawford
'833': Cathy_Chisholm
'834': Cathy_Cunningham
'835': Cathy_Freeman
'836': Catriona_Le_May_Doan
'837': Cecile_de_France
'838': Cecilia_Bolocco
'839': Cecilia_Chang
'840': Cecilia_Cheung
'841': Cedric_Benson
'842': Celia_Cruz
'843': Celine_Dion
'844': Celso_Amorim
'845': Celso_Lafer
'846': Cemil_Cicek
'847': Cesar_Gaviria
'848': Cesar_Maia
'849': Cha_Yung-gu
'850': Chadha_Gurinder
'851': Chakib_Khelil
'852': Chan_Choi
'853': Chan_Gailey
'854': Chan_Ho_Park
'855': Chance_Mock
'856': Chanda_Rubin
'857': Chandrika_Kumaratunga
'858': Chang_Dae-whan
'859': Chang_Jae_On
'860': Chang_Saio-yue
'861': Chang_Sang
'862': Chang_Tso
'863': Chante_Jawan_Mallard
'864': Charla_Moye
'865': Charlene_Barshefsky
'866': Charles_Bell
'867': Charles_Bronson
'868': Charles_Chandler_IV
'869': Charles_Cope
'870': Charles_Grassley
'871': Charles_Holzner
'872': Charles_Ingram
'873': Charles_Kartman
'874': Charles_Lebois
'875': Charles_Mathews
'876': Charles_Moose
'877': Charles_Pickering
'878': Charles_Pouty
'879': Charles_Richardson
'880': Charles_Rogers
'881': Charles_Schumer
'882': Charles_Tannok
'883': Charles_Taylor
'884': Charley_Armey
'885': Charlie_Coles
'886': Charlie_Deane
'887': Charlie_Garner
'888': Charlie_Hunnam
'889': Charlie_Sheen
'890': Charlie_Williams
'891': Charlie_Zaa
'892': Charlize_Theron
'893': Charlotte_Casiraghi
'894': Charlotte_Chambers
'895': Charlotte_Church
'896': Charlotte_Rampling
'897': Charlton_Heston
'898': Charmaine_Crooks
'899': Chawki_Armali
'900': Chea_Sophara
'901': Chelsea_Clinton
'902': Chen_Kaige
'903': Chen_Liang_Yu
'904': Chen_Shui-bian
'905': Chen_Tsai-chin
'906': Cherie_Blair
'907': Cherry_Jones
'908': Cheryl_Ford
'909': Cheryl_Hines
'910': Cheryl_James
'911': Cheryl_Little
'912': Cheryl_Tiegs
'913': Chhouk_Rin
'914': Chick_Hearn
'915': Chin-Feng_Chen
'916': Chin-Hui_Tsao
'917': Chip_Burrus
'918': Chip_Ganassi
'919': Chip_Knight
'920': Chistian_Stahl
'921': Chita_Rivera
'922': Chloe_Sevigny
'923': Cho_Myung-kyun
'924': Choi_Sung-hong
'925': Choi_Yun-yong
'926': Chok_Tong_Goh
'927': Chris_Andrews
'928': Chris_Bell
'929': Chris_Byrd
'930': Chris_Cirino
'931': Chris_Claiborne
'932': Chris_Columbus
'933': Chris_Cookson
'934': Chris_Cooper
'935': Chris_Cornell
'936': Chris_Crocker
'937': Chris_Dodd
'938': Chris_Forsyth
'939': Chris_Gratton
'940': Chris_Hernandez
'941': Chris_Klein
'942': Chris_Kolanas
'943': Chris_Matthews
'944': Chris_Moore
'945': Chris_Neil
'946': Chris_Noth
'947': Chris_Penn
'948': Chris_Pronger
'949': Chris_Reitsma
'950': Chris_Rock
'951': Chris_Simon
'952': Chris_Swecker
'953': Chris_Terry
'954': Chris_Thomas
'955': Chris_Tucker
'956': Chris_Whitney
'957': Christian_Bale
'958': Christian_Fittipaldi
'959': Christian_Gimenez
'960': Christian_Lacroix
'961': Christian_Lirette
'962': Christian_Longo
'963': Christian_Malcolm
'964': Christian_Olsson
'965': Christian_Patino
'966': Christian_Von_Wernich
'967': Christian_Wulff
'968': Christiane_Wulff
'969': Christina_Aguilera
'970': Christina_Sawaya
'971': Christine_Arron
'972': Christine_Baumgartner
'973': Christine_Ebersole
'974': Christine_Gregoire
'975': Christine_Rau
'976': Christine_Todd_Whitman
'977': Christoph_Daum
'978': Christopher_Amolsch
'979': Christopher_Conyers
'980': Christopher_Matero
'981': Christopher_Patten
'982': Christopher_Reeve
'983': Christopher_Russell
'984': Christopher_Speer
'985': Christopher_Walken
'986': Christopher_Whittle
'987': Christy_Ferer
'988': Christy_Turlington
'989': Chuanyun_Li
'990': Chuck_Amato
'991': Chuck_Bednarik
'992': Chuck_Eidson
'993': Chuck_Finley
'994': Chuck_Hagel
'995': Chuck_Woolery
'996': Chuck_Yeager
'997': Chung_Mong-hun
'998': Chung_Mong-joon
'999': Chyung_Dai-chul
'1000': Ciaran_Hinds
'1001': Cindy_Crawford
'1002': Cindy_Klassen
'1003': Cindy_Margolis
'1004': Cindy_Moll
'1005': Cindy_Taylor
'1006': Cindy_Zagorski
'1007': Ciro_Gomes
'1008': Claire_Danes
'1009': Claire_De_Gryse
'1010': Claire_Hentzen
'1011': Claire_Leger
'1012': Claire_Tomalin
'1013': Clara_Harris
'1014': Clare_Latimer
'1015': Clare_Short
'1016': Clark_Randt
'1017': Claude_Jorda
'1018': Claudette_Robinson
'1019': Claudia_Cardinale
'1020': Claudia_Coslovich
'1021': Claudia_Pechstein
'1022': Claudia_Schiffer
'1023': Claudine_Farrell
'1024': Claudio_Abbado
'1025': Claudio_Lopez
'1026': Claudio_Ranieri
'1027': Clay_Aiken
'1028': Clay_Campbell
'1029': Clemente_de_la_Vega
'1030': Cliff_Ellis
'1031': Clifford_Etienne
'1032': Clifford_Robinson
'1033': Clint_Eastwood
'1034': Clint_Howard
'1035': Clint_Lamebear
'1036': Clive_Lloyd
'1037': Clive_Woodward
'1038': Coco_dEste
'1039': Cole_Chapman
'1040': Coleen_Rowley
'1041': Colin_Campbell
'1042': Colin_Cowie
'1043': Colin_Farrell
'1044': Colin_Jackson
'1045': Colin_Montgomerie
'1046': Colin_Phillips
'1047': Colin_Powell
'1048': Colin_Prescot
'1049': Colleen_Atwood
'1050': Colleen_Donovan
'1051': Colleen_Jones
'1052': Colleen_OClair
'1053': Colleen_Ryan
'1054': Collis_Temple_III
'1055': Columba_Bush
'1056': Compay_Segundo
'1057': Conan_OBrien
'1058': Conchita_Martinez
'1059': Condoleezza_Rice
'1060': Connie_Chung
'1061': Connie_Freydell
'1062': Conrad_Black
'1063': Constance_Marie
'1064': Cora_Cambell
'1065': Coretta_Scott_King
'1066': Corey_Maggette
'1067': Cori_Enghusen
'1068': Corinna_Harfouch
'1069': Corinne_Coman
'1070': Corliss_Williamson
'1071': Cosmo_Iacavazzi
'1072': Costas_Simitis
'1073': Courtney_Cox
'1074': Courtney_Love
'1075': Craig_Burley
'1076': Craig_David
'1077': Craig_Doblin
'1078': Craig_Fitzgibbon
'1079': Craig_MacTavish
'1080': Craig_Morgan
'1081': Craig_OClair
'1082': Craig_Wilson
'1083': Crandall_Bowles
'1084': Crispin_Glover
'1085': Cristian_Barros
'1086': Cristiano_da_Matta
'1087': Cristina_Fernandez
'1088': Cristina_Kirchner
'1089': Cristina_Saralegui
'1090': Cristina_Torrens_Valero
'1091': Cruz_Bustamante
'1092': Cuba_Gooding_Jr
'1093': Curt_Weldon
'1094': Curtis_Joseph
'1095': Curtis_Rodriguez
'1096': Curtis_Strange
'1097': Cyndi_Thompson
'1098': Cynthia_Nixon
'1099': Cynthia_Rowley
'1100': DAngelo_Jimenez
'1101': Dagmar_Dunlevy
'1102': Dai_Bachtiar
'1103': Dai_Chul_Chyung
'1104': Daisy_Fuentes
'1105': Daja_Bedanova
'1106': Dalai_Lama
'1107': Dale_Bosworth
'1108': Dale_Earnhardt
'1109': Dale_Earnhardt_Jr
'1110': Dalia_Rabin-Pelosoff
'1111': Damarius_Bilbo
'1112': Damon_Dash
'1113': Damon_Stoudamire
'1114': Damon_van_Dam
'1115': Dan_Ackroyd
'1116': Dan_Bartlett
'1117': Dan_Boyle
'1118': Dan_Bylsma
'1119': Dan_Dickau
'1120': Dan_Duquette
'1121': Dan_Guerrero
'1122': Dan_Kellner
'1123': Dan_LaCoutre
'1124': Dan_Monson
'1125': Dan_Morales
'1126': Dan_Prinster
'1127': Dan_Quayle
'1128': Dan_Reeves
'1129': Dan_Snyder
'1130': Dan_Wheldon
'1131': Dana_Vollmer
'1132': Daniel_Barenboim
'1133': Daniel_Bruehl
'1134': Daniel_Chin
'1135': Daniel_Coats
'1136': Daniel_Comisso_Urdaneta
'1137': Daniel_Darnell
'1138': Daniel_Day-Lewis
'1139': Daniel_Kurtzer
'1140': Daniel_Montenegro
'1141': Daniel_Montgomery
'1142': Daniel_Ortega
'1143': Daniel_Osorno
'1144': Daniel_Patrick_Moynihan
'1145': Daniel_Pearl
'1146': Daniel_Radcliffe
'1147': Daniel_Rouse
'1148': Daniel_Scioli
'1149': Daniel_Zelman
'1150': Daniela_Cicarelli
'1151': Daniela_Hantuchova
'1152': Daniele_Bergamin
'1153': Daniele_Hypolito
'1154': Daniele_Nardello
'1155': Daniell_Sunjata
'1156': Danielle_Spencer
'1157': Danis_Tanovic
'1158': Danny_Ainge
'1159': Danny_Avalon
'1160': Danny_Elfman
'1161': Danny_Glover
'1162': Danny_Green
'1163': Danny_Morgan
'1164': Dany_Heatley
'1165': Darcy_Regier
'1166': Darin_Erstad
'1167': Dario_Camuffo
'1168': Dario_Franchitti
'1169': Dariusz_Michalczewski
'1170': Darko_Milicic
'1171': Darla_Moore
'1172': Darlene_Garrettson
'1173': Darrell_Dickey
'1174': Darrell_Issa
'1175': Darrell_Porter
'1176': Darrell_Royal
'1177': Darren_Campel
'1178': Darren_Clarke
'1179': Darryl_McDaniels
'1180': Darryl_Stingley
'1181': Darvis_Patton
'1182': Daryl_Hannah
'1183': Daryl_Jones
'1184': Daryl_Parks
'1185': Daryl_Sabara
'1186': Daryl_Smith
'1187': Dave_Barr
'1188': Dave_Campo
'1189': Dave_Johnson
'1190': Dave_Lewis
'1191': Dave_Matthews
'1192': Dave_McGinnis
'1193': Dave_McNally
'1194': Dave_McNealey
'1195': Dave_Odom
'1196': Dave_Potter
'1197': Dave_Ragone
'1198': Dave_Robertson
'1199': Dave_Tucker
'1200': Dave_Wannstedt
'1201': Dave_Williams
'1202': Davey_Johnson
'1203': David_Alpay
'1204': David_Anderson
'1205': David_Arquette
'1206': David_Ballantyne
'1207': David_Beckham
'1208': David_Bell
'1209': David_Bisbal
'1210': David_Blaine
'1211': David_Bowie
'1212': David_Braley
'1213': David_Brent
'1214': David_Brinkley
'1215': David_Brown
'1216': David_Canary
'1217': David_Caraway
'1218': David_Carradine
'1219': David_Caruso
'1220': David_Chase
'1221': David_Collenette
'1222': David_Coulthard
'1223': David_Dewayne_Williams
'1224': David_Dewhurst
'1225': David_Dodge
'1226': David_Donohue
'1227': David_Dorfman
'1228': David_Duke
'1229': David_Duval
'1230': David_Eldon
'1231': David_Gest
'1232': David_Glenn
'1233': David_Hannay
'1234': David_Hanson
'1235': David_Hasselhoff
'1236': David_Heyman
'1237': David_Heymann
'1238': David_Hilt
'1239': David_Ho
'1240': David_Howard
'1241': David_Hyde_Pierce
'1242': David_Kelley
'1243': David_Kelly
'1244': David_Leahy
'1245': David_McCallum
'1246': David_McCullough
'1247': David_McKiernan
'1248': David_Millar
'1249': David_Modell
'1250': David_Montoya
'1251': David_Myers
'1252': David_Nalbandian
'1253': David_Obey
'1254': David_Oh
'1255': David_Provost
'1256': David_Przybyszewski
'1257': David_Rivkin_Jr
'1258': David_Scott_Morris
'1259': David_Shayler
'1260': David_Sibleyk
'1261': David_Siegel
'1262': David_Sousa
'1263': David_Spade
'1264': David_Stern
'1265': David_Suazo
'1266': David_Surrett
'1267': David_Tornberg
'1268': David_Trimble
'1269': David_Welch
'1270': David_Wells
'1271': David_Westerfield
'1272': David_Wolf
'1273': David_Zeplowitz
'1274': Davis_Love_III
'1275': Dawn_Staley
'1276': Dawna_LoPiccolo
'1277': Dean_Barker
'1278': Dean_Barkley
'1279': Dean_Jacek
'1280': Dean_Sheremet
'1281': Deb_Santos
'1282': Debbie_Allen
'1283': Debbie_Reynolds
'1284': Debra_Brown
'1285': Debra_Messing
'1286': Debra_Rose
'1287': Debra_Shank
'1288': Debra_Yang
'1289': Deece_Eckstein
'1290': Deena_Burnett
'1291': Deepa_Mehta
'1292': Della_Clara
'1293': Delphine_Chuillot
'1294': Demetrin_Veal
'1295': Demetrius_Ferraciu
'1296': Demi_Moore
'1297': Denis_Coderre
'1298': Denis_Fassou-Nguesso
'1299': Denise_Johnson
'1300': Denise_Locke
'1301': Denise_van_Outen
'1302': Deniz_Baykal
'1303': Dennis_Archer
'1304': Dennis_Erickson
'1305': Dennis_Franchione
'1306': Dennis_Hastert
'1307': Dennis_Johnson
'1308': Dennis_Kozlowski
'1309': Dennis_Kucinich
'1310': Dennis_Miller
'1311': Dennis_Oswald
'1312': Dennis_Powell
'1313': Denys_Arcand
'1314': Denzel_Washington
'1315': Dereck_Whittenburg
'1316': Derek_Abney
'1317': Derek_Bond
'1318': Derek_Jeter
'1319': Derek_King
'1320': Derek_Lowe
'1321': Derek_Parra
'1322': Derian_Hatcher
'1323': Derrick_Battie
'1324': Derrick_Rodgers
'1325': Derrick_Taylor
'1326': Des_Brown
'1327': Desiree_Lemosi
'1328': Desiree_McKenzie
'1329': Desmon_Farmer
'1330': Devin_Harris
'1331': Dewayne_White
'1332': Dexter_Jackson
'1333': Diana_Krall
'1334': Diana_Munz
'1335': Diana_Renee_Valdivieso_Dubon
'1336': Diana_Ross
'1337': Diana_Silvius
'1338': Diana_Taurasi
'1339': Diana_Taylor
'1340': Diane_Green
'1341': Diane_Ladd
'1342': Diane_Lane
'1343': Dianne_Feinstein
'1344': Dianne_Reeves
'1345': Dick_Armey
'1346': Dick_Bennett
'1347': Dick_Cheney
'1348': Dick_Clark
'1349': Dick_Devine
'1350': Dick_Jauron
'1351': Dick_Latessa
'1352': Dick_Posthumus
'1353': Dick_Smothers
'1354': Dick_Vermeil
'1355': Didier_Defago
'1356': Diego_Armando_Maradona
'1357': Diego_Colorado
'1358': Diego_Diego_Lerman
'1359': Dieter_Holzer
'1360': Dieter_Zetsche
'1361': Dimitar_Berbatov
'1362': Dimitri_Perricos
'1363': Din_Samsudin
'1364': Dinah_Turner
'1365': Dino_Risi
'1366': Dino_de_Laurentis
'1367': Dinora_Rosales
'1368': Dion_Glover
'1369': Dionigi_Tettamanzi
'1370': Dionne_Warwick
'1371': Dionyssis_Georgiadis
'1372': Dirk_Kempthorne
'1373': Dita_Von_Tesse
'1374': Djabir_Said-Guerni
'1375': Doc_Rivers
'1376': Dolly_Parton
'1377': Dolma_Tsering
'1378': Dominic_Monaghan
'1379': Dominick_Dunne
'1380': Dominik_Garcia-Lorido
'1381': Dominik_Hrbaty
'1382': Dominique_Perben
'1383': Dominique_de_Villepin
'1384': Don_Boudria
'1385': Don_Carcieri
'1386': Don_Flanagan
'1387': Don_Henley
'1388': Don_Hewitt
'1389': Don_King
'1390': Don_Lake
'1391': Don_Matthews
'1392': Don_Meredith
'1393': Don_Nickles
'1394': Don_Siegelman
'1395': Donald_Anderson
'1396': Donald_Carty
'1397': Donald_Evans
'1398': Donald_Fehr
'1399': Donald_Hays
'1400': Donald_Keck
'1401': Donald_Keyser
'1402': Donald_Pettit
'1403': Donald_Regan
'1404': Donald_Rumsfeld
'1405': Donald_Trump
'1406': Donatella_Versace
'1407': Donna_Barrera
'1408': Donna_Brazile
'1409': Donna_Morrissey
'1410': Donna_Ralston
'1411': Donna_Shalala
'1412': Donna_Walker
'1413': Donnie_Brennan
'1414': Donny_Osmond
'1415': Dora_Bakoyianni
'1416': Doris_Roberts
'1417': Doris_Schroeder
'1418': Dorothy_Lamour
'1419': Dorothy_Loudon
'1420': Dorothy_Wilson
'1421': Dorthy_Moxley
'1422': Dot_Helms
'1423': Doug_Christie
'1424': Doug_Collins
'1425': Doug_Duncan
'1426': Doug_Melvin
'1427': Doug_Moe
'1428': Doug_Racine
'1429': Doug_Wilson
'1430': Douglas_Faneuil
'1431': Douglas_Gansler
'1432': Douglas_Meester
'1433': Douglas_Paal
'1434': Dragan_Covic
'1435': Drew_Barrymore
'1436': Drew_Bledsoe
'1437': Drew_Gooden
'1438': Du_Qinglin
'1439': Duane_Barber
'1440': Duane_Lee_Chapman
'1441': Dudley_Rogers
'1442': Dule_Hill
'1443': Duncan_Fletcher
'1444': Dunn_Lampton
'1445': Dustan_Mohr
'1446': Dustin_Brown
'1447': Dustin_Hoffman
'1448': Dusty_Baker
'1449': Dwain_Kyles
'1450': Dwayne_Johnson
'1451': Dwayne_Wade
'1452': Dwayne_Williams
'1453': Dyab_Abou_Jahjah
'1454': Dyana_Calub
'1455': E_Clay_Shaw
'1456': Earl_Campbell
'1457': Earl_Counter
'1458': Earl_Fritts
'1459': Earl_Scruggs
'1460': Ed_Book
'1461': Ed_Case
'1462': Ed_Mekertichian
'1463': Ed_Rendell
'1464': Ed_Rosenthal
'1465': Ed_Smart
'1466': Ed_Sullivan
'1467': Ed_Wade
'1468': Eddie_Compass
'1469': Eddie_Fenech_Adami
'1470': Eddie_Jordan
'1471': Eddie_Lewis
'1472': Eddie_Lucio
'1473': Eddie_Murray
'1474': Eddie_Sutton
'1475': Eddy_Hartenstein
'1476': Eddy_Merckx
'1477': Edgar_Savisaar
'1478': Edie_Falco
'1479': Edina_Batar
'1480': Edith_Masai
'1481': Edmund_Hillary
'1482': Edmund_Stoiber
'1483': Edouard_Michelin
'1484': Eduard_Limonov
'1485': Eduard_Shevardnadze
'1486': Eduardo_Chillida
'1487': Eduardo_Duhalde
'1488': Eduardo_Fischer
'1489': Eduardo_Romero
'1490': Edward_Albee
'1491': Edward_Arsenault
'1492': Edward_Belvin
'1493': Edward_Burns
'1494': Edward_Egan
'1495': Edward_Flynn
'1496': Edward_Greenspan
'1497': Edward_James_Olmos
'1498': Edward_Johnson
'1499': Edward_Kennedy
'1500': Edward_Lohn
'1501': Edward_Lu
'1502': Edward_Norton
'1503': Edward_Said
'1504': Edward_Seaga
'1505': Edwin_Edwards
'1506': Edwina_Currie
'1507': Efrain_Rios_Montt
'1508': Eglis_Yaima_Cruz
'1509': Eileen_Coparropa
'1510': Eileen_Spina
'1511': Einars_Repse
'1512': Ekaterina_Dmitriev
'1513': Ekke_Hard_Forberg
'1514': El_Hadji_Diouf
'1515': Eladio_Larez
'1516': Elaine_Chao
'1517': Elaine_Stritch
'1518': Elena_Bereznaya
'1519': Elena_Bovina
'1520': Elena_Dementieva
'1521': Elena_Likhovtseva
'1522': Elena_Tihomirova
'1523': Elena_de_Chavez
'1524': Elgin_Baylor
'1525': Eli_Broad
'1526': Eli_Rosenbaum
'1527': Eli_Stutsman
'1528': Eliane_Karp
'1529': Elias_Attallah
'1530': Elijah_Wood
'1531': Elijan_Ingram
'1532': Elin_Nordegren
'1533': Elinor_Caplan
'1534': Eliott_Spitzer
'1535': Elisabeth_Schumacher
'1536': Elisabeth_Welch
'1537': Elisha_Cuthbert
'1538': Eliza_Dushku
'1539': Eliza_Manningham-Buller
'1540': Elizabeth_Berkeley
'1541': Elizabeth_Dole
'1542': Elizabeth_Hill
'1543': Elizabeth_Hurley
'1544': Elizabeth_Pena
'1545': Elizabeth_Regan
'1546': Elizabeth_Shue
'1547': Elizabeth_Smart
'1548': Elizabeth_Taylor
'1549': Ellen_Barkin
'1550': Ellen_DeGeneres
'1551': Ellen_Engleman
'1552': Ellen_MacArthur
'1553': Ellen_Martin
'1554': Ellen_Pompeo
'1555': Ellen_Saracini
'1556': Elliott_Mincberg
'1557': Elmar_Brok
'1558': Elodie_Bouchez
'1559': Eloy_Gutierrez
'1560': Elsa_Zylberstein
'1561': Elton_John
'1562': Elva_Hsiao
'1563': Elvis_Costello
'1564': Elvis_Presley
'1565': Elvis_Stojko
'1566': Emanuel_Ginobili
'1567': Emelie_Loit
'1568': Emile_Lahoud
'1569': Emilio_Azcarraga_Jean
'1570': Emilio_Botin
'1571': Emily_Mason
'1572': Emily_Mortimer
'1573': Emily_Robison
'1574': Emily_Stevens
'1575': Eminem
'1576': Emma_Nicholson
'1577': Emma_Thompson
'1578': Emma_Watson
'1579': Emmanuel_Filiberto
'1580': Emmanuel_Milingo
'1581': Emmanuelle_Beart
'1582': Emmanuelle_Jagodsinski
'1583': Emmit_Smith
'1584': Emmy_Rossum
'1585': Emyr_Jones_Parry
'1586': Enola_Rice
'1587': Enos_Slaughter
'1588': Enrica_Fico
'1589': Enrik_Vendt
'1590': Enrique_Bolanos
'1591': Enrique_Haroldo_Gorriaran_Merlo
'1592': Enrique_Iglesias
'1593': Enrique_Medina_Gomez
'1594': Enrique_Oliu
'1595': Eric_Bana
'1596': Eric_Benet
'1597': Eric_Christian_Olsen
'1598': Eric_Clapton
'1599': Eric_Daze
'1600': Eric_Dubin
'1601': Eric_Fehr
'1602': Eric_Hinske
'1603': Eric_Idle
'1604': Eric_Lindros
'1605': Eric_Lloyd
'1606': Eric_Robert_Rudolph
'1607': Eric_Rosser
'1608': Eric_Ryan_Donnelly
'1609': Eric_Schacht
'1610': Eric_Shinseki
'1611': Eric_Snow
'1612': Eric_Staal
'1613': Eric_Taino
'1614': Eric_Vigouroux
'1615': Eric_Wedge
'1616': Erick_Barkley
'1617': Erik_Morales
'1618': Erika_Christensen
'1619': Erika_Harold
'1620': Erika_Reyes
'1621': Eriko_Tamura
'1622': Erin_Brockovich
'1623': Erin_Hershey_Presley
'1624': Erin_Runnion
'1625': Ernest_Hollings
'1626': Ernesto_Zedillo
'1627': Ernie_Els
'1628': Ernie_Eves
'1629': Ernie_Fletcher
'1630': Ernie_Grunfeld
'1631': Ernie_Harwell
'1632': Ernie_Preate
'1633': Ernie_Stewart
'1634': Erskine_Bowles
'1635': Erwin_Abdullah
'1636': Erwin_Mapasseng
'1637': Esad_Landzo
'1638': Esteban_Cordoba-Velazquez
'1639': Estella_Warren
'1640': Estelle_Morris
'1641': Ester_Canadas
'1642': Esther_Macklin
'1643': Ethan_Hawke
'1644': Etta_James
'1645': Eugene_Melnyk
'1646': Eugene_Teslovic
'1647': Eunice_Barber
'1648': Eurico_Guterres
'1649': Eva_Amurri
'1650': Eva_Dimas
'1651': Eva_Herzigova
'1652': Eva_Marie_Saint
'1653': Eva_Mendes
'1654': Evan_Marriott
'1655': Evan_Rachel_Wood
'1656': Evander_Holyfield
'1657': Eve_Ensler
'1658': Eve_Pelletier
'1659': Evelyn_Lauder
'1660': Evgeni_Plushenko
'1661': Evie_Lazarou
'1662': Evo_Morales
'1663': Ewan_McGregor
'1664': Fabian_Vargas
'1665': Fabiola_Zuluaga
'1666': Fabrice_Santoro
'1667': Fabricio_Oberto
'1668': Faisal_Iqbal
'1669': Faisal_Saleh_Hayat
'1670': Fann_Wong
'1671': Farida_Ragoonanan
'1672': Farouk_Kaddoumi
'1673': Farouk_al-Sharaa
'1674': Fatma_Kusibeh
'1675': Fatmir_Limaj
'1676': Faye_Alibocus
'1677': Faye_Dunaway
'1678': Faye_Wong
'1679': Fayssal_Mekdad
'1680': Fazal-ur-Rehman
'1681': Federico_Castelan_Sayre
'1682': Federico_Fellini
'1683': Federico_Trillo
'1684': Feliciano_Lopez
'1685': Felicity_Huffman
'1686': Felipe_De_Borbon
'1687': Felipe_Fernandez
'1688': Felipe_Perez_Roque
'1689': Felix_Doh
'1690': Felix_Mantilla
'1691': Felix_Sanchez
'1692': Felix_Trinidad
'1693': Ferenc_Madl
'1694': Fernando_Alonso
'1695': Fernando_Gonzalez
'1696': Fernando_Henrique_Cardoso
'1697': Fernando_Hierro
'1698': Fernando_Leon_de_Aranoa
'1699': Fernando_Sanz
'1700': Fernando_Valenzuela
'1701': Fernando_Vargas
'1702': Fernando_Velardez
'1703': Festus_Mogae
'1704': Fidel_Castro
'1705': Fidel_Castro_Daiz-Balart
'1706': Filip_De_Winter
'1707': Filippo_Inzaghi
'1708': Filippo_Volandri
'1709': Fiona_Milne
'1710': Flavia_Delaroli
'1711': Flavia_Pennetta
'1712': Flor_Montulo
'1713': Florecita_Cobian
'1714': Florencia_Kirchner
'1715': Florencia_Macri
'1716': Floyd_Keith
'1717': Floyd_Mayweather
'1718': Fran_Drescher
'1719': Frances_Fisher
'1720': Francesco_Totti
'1721': Francis_Collins
'1722': Francis_Crick
'1723': Francis_Ford_Coppola
'1724': Francis_George
'1725': Francis_Mer
'1726': Francis_Ricciardone
'1727': Francisco_Flores
'1728': Francisco_Garcia
'1729': Francisco_Maturana
'1730': Francisco_Santos
'1731': Francisco_Urenda
'1732': Franck_Cerutti
'1733': Franco_Cangele
'1734': Franco_Dragone
'1735': Franco_Frattini
'1736': Francois_Botha
'1737': Francois_Ozon
'1738': Francois_Pienaar
'1739': Frank_Abagnale_Jr
'1740': Frank_Beamer
'1741': Frank_Bell
'1742': Frank_Cassell
'1743': Frank_Coraci
'1744': Frank_Dunham_Jr
'1745': Frank_Griswold
'1746': Frank_Hilldrup
'1747': Frank_Hsieh
'1748': Frank_Keating
'1749': Frank_Lautenberg
'1750': Frank_Marshall
'1751': Frank_Murkowski
'1752': Frank_Pallone
'1753': Frank_Schmoekel
'1754': Frank_Shea
'1755': Frank_Sinatra
'1756': Frank_Solich
'1757': Frank_Stallone
'1758': Frank_Taylor
'1759': Frank_Van_Ecke
'1760': Frank_Wycheck
'1761': Frank_Zappa
'1762': Franklin_Brown
'1763': Franklin_Damann
'1764': Franko_Simatovic
'1765': Franz_Beckenbauer
'1766': Franz_Fischler
'1767': Franz_Gsell
'1768': Franz_Muentefering
'1769': Fred_Durst
'1770': Fred_Eckhard
'1771': Fred_Funk
'1772': Fred_Huff
'1773': Fred_Rogers
'1774': Fred_Swan
'1775': Fred_Thompson
'1776': Fred_Wilpon
'1777': Freda_Black
'1778': Freddy_Garcia
'1779': Freddy_Vasques_Kinchokpe
'1780': Frederick_Madden
'1781': Frederique_van_der_Wal
'1782': Fredric_Seaman
'1783': Fruit_Chan
'1784': Fujio_Cho
'1785': Fujio_Mitarai
'1786': GL_Peiris
'1787': Gabi_Zimmer
'1788': Gabriel_Batistuta
'1789': Gabriel_Farhi
'1790': Gabriel_Hughes
'1791': Gabriel_Jorge_Ferreia
'1792': Gabriel_Valdes
'1793': Gabriella_Bo
'1794': Gabrielle_Rose
'1795': Gabrielle_Union
'1796': Gael_Garcia_Bermal
'1797': Gala_Leon_Garcia
'1798': Galen_Rowell
'1799': Gao_Qiang
'1800': Garry_Alejano
'1801': Garry_Kasparov
'1802': Garry_McCoy
'1803': Garry_Trudeau
'1804': Garry_Witherall
'1805': Garth_Drabinsky
'1806': Gary_Bald
'1807': Gary_Barnett
'1808': Gary_Bauer
'1809': Gary_Bergeron
'1810': Gary_Bettman
'1811': Gary_Carter
'1812': Gary_Coleman
'1813': Gary_Condit
'1814': Gary_Dellaverson
'1815': Gary_Doer
'1816': Gary_Forsee
'1817': Gary_Gero
'1818': Gary_Gitnick
'1819': Gary_Leon_Ridgway
'1820': Gary_Locke
'1821': Gary_Marshall
'1822': Gary_Paer
'1823': Gary_Sayler
'1824': Gary_Sinise
'1825': Gary_Stevens
'1826': Gary_Williams
'1827': Gary_Winnick
'1828': Gaston_Gaudio
'1829': Gavin_Degraw
'1830': Gavyn_Arthur
'1831': Gavyn_Davies
'1832': Gen_Meredith
'1833': Gene_Autry
'1834': Gene_Hackman
'1835': Gene_Keady
'1836': Gene_Orza
'1837': Gene_Robinson
'1838': Gene_Sauers
'1839': Gennifer_Flowers
'1840': Geno_Auriemma
'1841': Geoff_Dixon
'1842': Geoff_Hoon
'1843': Geoffrey_Davis
'1844': Geoffrey_Rush
'1845': George_Allen
'1846': George_Blaney
'1847': George_Bovell
'1848': George_Brumley
'1849': George_Brumley_III
'1850': George_Clooney
'1851': George_Foreman
'1852': George_Galloway
'1853': George_Gregan
'1854': George_HW_Bush
'1855': George_Harrison
'1856': George_Karl
'1857': George_Lopez
'1858': George_Lucas
'1859': George_Maxwell_Richards
'1860': George_McCloud
'1861': George_Murphy
'1862': George_P_Bush
'1863': George_Papandreou
'1864': George_Pataki
'1865': George_Plimpton
'1866': George_Robertson
'1867': George_Roy_Hill
'1868': George_Ryan
'1869': George_Tenet
'1870': George_Voinovich
'1871': George_W_Bush
'1872': Georgi_Parvanov
'1873': Georgia_Giddings
'1874': Georgina_Bardach
'1875': Georgina_Papin
'1876': Geovani_Lapentti
'1877': Gerald_Barbarito
'1878': Gerald_Calabrese
'1879': Gerald_Fitch
'1880': Gerald_Ford
'1881': Gerald_Riley
'1882': Geraldine_Chaplin
'1883': Geraldo_Rivera
'1884': Gerard_Butler
'1885': Gerard_Depardieu
'1886': Gerard_Kleisterlee
'1887': Gerard_Tronche
'1888': Gerard_de_Cortanze
'1889': Gerardo_Gambala
'1890': Gerhard_Boekel
'1891': Gerhard_Schmid
'1892': Gerhard_Schroeder
'1893': German_Khan
'1894': Gerrit_Zalm
'1895': Gerry_Adams
'1896': Gerry_Kelly
'1897': Gerry_Parsky
'1898': Ghassan_Elashi
'1899': Gholamreza_Aghazadeh
'1900': Gian_Marco
'1901': Giancarlo_Fisichella
'1902': Gianna_Angelopoulos-Daskalaki
'1903': Gianni_Agnelli
'1904': Giannina_Facio
'1905': Gideon_Black
'1906': Gideon_Yago
'1907': Gil_Cates
'1908': Gil_de_Ferran
'1909': Gilberto_Rodriguez_Orejuela
'1910': Gilberto_Simoni
'1911': Gilles_Panizzi
'1912': Gillian_Anderson
'1913': Gina_Centrello
'1914': Gina_Gershon
'1915': Gina_Lollobrigida
'1916': Gina_Torres
'1917': Giovanny_Cordoba
'1918': Gisele_Bundchen
'1919': Giselle_Estefania_Tavarelli
'1920': Giulietta_Masina
'1921': Giulio_Andreotti
'1922': Giuseppe_Gibilisco
'1923': Giuseppe_Morchio
'1924': Glafcos_Clerides
'1925': Glen_Clark
'1926': Glen_DaSilva
'1927': Glen_Sather
'1928': Glenn_Plummer
'1929': Glenn_Rivers
'1930': Glenn_Tilton
'1931': Gloria_Allred
'1932': Gloria_Gaynor
'1933': Gloria_Macapagal_Arroyo
'1934': Gloria_Trevi
'1935': Goh_Kun
'1936': Goldie_Hawn
'1937': Gong_Li
'1938': Gong_Ruina
'1939': Gonzalo_Barrientos
'1940': Gonzalo_Sanchez_de_Lozada
'1941': Goran_Persson
'1942': Goran_Zivkovic
'1943': Gordana_Grubin
'1944': Gorden_Tallis
'1945': Gordon_Brown
'1946': Gordon_Campbell
'1947': Gordon_Cooper
'1948': Gordon_Lightfoot
'1949': Gordon_McDonald
'1950': Gore_Verbinski
'1951': Gore_Vidal
'1952': Grace_Brinell
'1953': Grace_Dodd
'1954': Grace_Kelly
'1955': Gracia_Burnham
'1956': Graciano_Rocchigiani
'1957': Grady_Irvin_Jr
'1958': Grady_Little
'1959': Graeme_Lloyd
'1960': Graeme_Smith
'1961': Graham_Bentley
'1962': Grant_Hackett
'1963': Grant_Rossenmeyer
'1964': Gray_Davis
'1965': Greg_Frers
'1966': Greg_Gilbert
'1967': Greg_Hennigar
'1968': Greg_Hodge
'1969': Greg_Kinnear
'1970': Greg_Kinsey
'1971': Greg_Ostertag
'1972': Greg_Owen
'1973': Greg_Rusedski
'1974': Gregg_Berhalter
'1975': Gregg_Popovich
'1976': Gregor_Gysi
'1977': Gregorio_Honasan
'1978': Gregorio_Rosal
'1979': Gregory_Geoffroy
'1980': Gregory_Hines
'1981': Gregory_Peck
'1982': Gretchen_Mol
'1983': Griffin_Colvin
'1984': Gro_Harlem_Brundtland
'1985': Guangdong_Ou_Guangyuan
'1986': Guennadi_Chipouline
'1987': Guenter_Verheugen
'1988': Guido_Westerwelle
'1989': Guillaume_Cannet
'1990': Guillaume_Depardieu
'1991': Guillaume_Soro
'1992': Guillermo_Canas
'1993': Guillermo_Coria
'1994': Guillermo_Monroy
'1995': Guillermo_Ortiz
'1996': Guillermo_Ruiz_Polanco
'1997': Gunilla_Backman
'1998': Gunter_Pleuger
'1999': Gus_Frerotte
'2000': Gus_Van_Sant
'2001': Gustavo_Cisneros
'2002': Gustavo_Franco
'2003': Gustavo_Kuerten
'2004': Gustavo_Noboa
'2005': Gustavo_Terrazas
'2006': Guus_Hiddink
'2007': Guy_Hemmings
'2008': Guy_Ritchie
'2009': Guy_Verhofstadt
'2010': Gwen_Stefani
'2011': Gwendal_Peizerat
'2012': Gwyneth_Paltrow
'2013': Habib_Hisham
'2014': Habib_Rizieq
'2015': Hadley_Bilger
'2016': Hal_Gehman
'2017': Hal_McCoy
'2018': Hal_Sellers
'2019': Hal_Sutton
'2020': Halbert_Fillinger
'2021': Halle_Berry
'2022': Ham_Pong-sil
'2023': Hama_Arba_Diallo
'2024': Hamad_Bin_Isa_al-Khalifa
'2025': Hamad_Bin_Jassim
'2026': Hamid_Efendi
'2027': Hamid_Karzai
'2028': Hamid_Reza_Asefi
'2029': Hamza_Atiya_Muhsen
'2030': Hamzah_Haz
'2031': Han_Sung_Joo
'2032': Hana_Makhmalbaf
'2033': Hana_Sadiq
'2034': Hana_Urushima
'2035': Hanan_Ashrawi
'2036': Hank_Aaron
'2037': Hank_Azaria
'2038': Hank_Bass
'2039': Hank_McKinnell
'2040': Hank_Stram
'2041': Hannah_Stockbauer
'2042': Hanns_Schumacher
'2043': Hans-Christian_Schmid
'2044': Hans_Blix
'2045': Hans_Corell
'2046': Hans_Eichel
'2047': Hans_Leistritz
'2048': Hans_Peter_Briegel
'2049': Harald_Ringstorff
'2050': Harbhajan_Singh
'2051': Harland_Braun
'2052': Harold_Brown
'2053': Harold_Scott
'2054': Harriet_Lessy
'2055': Harrison_Ford
'2056': Harry_Belafonte
'2057': Harry_Kalas
'2058': Harry_Schmidt
'2059': Hartmut_Mehdorn
'2060': Harvey_Fierstein
'2061': Harvey_Wachsman
'2062': Harvey_Weinstein
'2063': Hasan_Wirayuda
'2064': Hashan_Tillakaratne
'2065': Hashim_Thaci
'2066': Hassan_Nasrallah
'2067': Hassan_Wirajuda
'2068': Hassanal_Bolkiah
'2069': Hatsui_Hasuike
'2070': Haydar_Aliyev
'2071': Hayden_Panettiere
'2072': Hayley_Tullett
'2073': Heath_Ledger
'2074': Heather_Chinnock
'2075': Heather_Locklear
'2076': Heather_Mills
'2077': Heather_Whitestone_McCallum
'2078': Heather_Willson
'2079': Hector_Babenco
'2080': Hector_Grullon
'2081': Hector_Mitelman
'2082': Hedayat_Amin_Arsala
'2083': Hee-Won_Han
'2084': Heidi_Fleiss
'2085': Heidi_Klum
'2086': Heinrich_Wolfgang
'2087': Heinz_Feldmann
'2088': Heizo_Takenaka
'2089': Helen_Alvare
'2090': Helen_Clark
'2091': Helen_Darling
'2092': Helena_Schneider
'2093': Helene_Eksterowicz
'2094': Helio_Castroneves
'2095': Helio_Rubens_Garcia
'2096': Helmut_Panke
'2097': Helo_Pinheiro
'2098': Henk_Bekedam
'2099': Henning_Scherf
'2100': Henri_Proglio
'2101': Henrique_Meirelles
'2102': Henry_Castellanos
'2103': Henry_Hilow
'2104': Henry_Hyde
'2105': Henry_Kissinger
'2106': Henry_Suazo
'2107': Herb_Brooks
'2108': Herb_Dhaliwal
'2109': Herb_Ritts
'2110': Herb_Sendek
'2111': Herbert_Haupt
'2112': Herbie_Hancock
'2113': Herman_Edwards
'2114': Herman_Moore
'2115': Hermando_Harton
'2116': Hermann_Maier
'2117': Hermes_Gamonal
'2118': Hermogenes_Ebdane_Jr
'2119': Hernan_Crespo
'2120': Hernan_Diaz
'2121': Herta_Daeubler-Gmelin
'2122': Hestrie_Cloette
'2123': Hichiro_Naemura
'2124': Hideki_Matsui
'2125': Hideki_Sato
'2126': Hidetoshi_Nakata
'2127': Hikmat_al-Azzawi
'2128': Hilary_Duff
'2129': Hilary_McKay
'2130': Hilda_Fortune
'2131': Hillary_Clinton
'2132': Hilmi_Akin_Zorlu
'2133': Hilmi_Ozkok
'2134': Himmler_Rebu
'2135': Hipolito_Mejia
'2136': Hiroki_Gomi
'2137': Hiroyuki_Yoshino
'2138': Hisao_Oguchi
'2139': Hisashi_Owada
'2140': Hisham_Halawi
'2141': Hitomi_Soga
'2142': Hitoshi_Oshitani
'2143': Hitoshi_Tanaka
'2144': Hoda_Asfor
'2145': Holly_Hunter
'2146': Holly_Robinson_Peete
'2147': Hong_Myung
'2148': Hootie_Johnson
'2149': Horace_Donovan_Reid
'2150': Horace_Newcomb
'2151': Horacio_Julio_Pina
'2152': Horacio_de_Jesus_Montoya
'2153': Horst_Koehler
'2154': Hosni_Mubarak
'2155': Howard_Dean
'2156': Howard_Ross
'2157': Howard_Schultz
'2158': Howard_Smith
'2159': Howard_Stern
'2160': Howard_Stringer
'2161': Howard_Wilkinson
'2162': Hrithik_Roshan
'2163': Hu_Jintao
'2164': Hu_Maoyuan
'2165': Huan_Chung_Yi
'2166': Huang_Suey-Sheng
'2167': Hubert_Green
'2168': Hubie_Brown
'2169': Hugh_Campbell
'2170': Hugh_Carey
'2171': Hugh_Grant
'2172': Hugh_Hefner
'2173': Hugh_Jessiman
'2174': Hugh_Miller
'2175': Hugo_Chavez
'2176': Hugo_Colace
'2177': Hugo_Conte
'2178': Humberto_Coelho
'2179': Humberto_Espinoza
'2180': Hun_Sen
'2181': Hung_Wan-ting
'2182': Hunter_Bates
'2183': Hunter_Kemper
'2184': Hushiar_Zebari
'2185': Hussam_Mohammed_Amin
'2186': Hussein_Malik
'2187': Hutomo_Mandala_Putra
'2188': Hwang_Doo-yun
'2189': Iain_Anderson
'2190': Iain_Duncan_Smith
'2191': Iain_Richmond
'2192': Ian_Campbell
'2193': Ian_Gillan
'2194': Ian_Huntley
'2195': Ian_Knop
'2196': Ian_McKellen
'2197': Ian_Moran
'2198': Ian_Smith
'2199': Ian_Thorpe
'2200': Ian_Wilmut
'2201': Iban_Mayo
'2202': Ibrahim_Al-Marashi
'2203': Ibrahim_Haddad
'2204': Ibrahim_Hilal
'2205': Ibrahim_Jaafari
'2206': Ibrahim_Rugova
'2207': Idi_Amin
'2208': Ignacio_Antonio_Velasco
'2209': Ignatius_Wang
'2210': Igor_Ivanov
'2211': Igor_Trunov
'2212': Ilan_Goldfajn
'2213': Ilan_Ramon
'2214': Ilham_Aliev
'2215': Ilie_Nastase
'2216': Imad_Khadduri
'2217': Imad_Moustapha
'2218': Imam_Samudra
'2219': Imelda_Marcos
'2220': Imran_Khan
'2221': Imre_Kertasz
'2222': Inam-ul-Haq
'2223': Infanta_Cristina
'2224': Inga_Hall
'2225': Ingrid_Betancourt
'2226': Inocencio_Arias
'2227': Intisar_Ajouri
'2228': Ion_Iliescu
'2229': Ion_Tiriac
'2230': Ira_Einhorn
'2231': Iran_Brown
'2232': Irene_Kahn
'2233': Irfan_Ahmed
'2234': Irina_Framtsova
'2235': Irina_Lobacheva
'2236': Irina_Yatchenko
'2237': Irv_Nathan
'2238': Irwan_Fadzi_Idris
'2239': Isabel_Orellana
'2240': Isabela_Moraes
'2241': Isabella_Rossellini
'2242': Isabelle_Huppert
'2243': Isaiah_Washington
'2244': Ishaq_Shahryar
'2245': Isidro_Pastor
'2246': Islam_Karimov
'2247': Ismael_Miranda
'2248': Ismail_Abu_Shanab
'2249': Ismail_Cem
'2250': Ismail_Khan
'2251': Ismail_Merchant
'2252': Itamar_Franco
'2253': Itzhak_Perlman
'2254': Iva_Majoli
'2255': Ivan_Helguera
'2256': Ivan_Lee
'2257': Ivan_Shvedoff
'2258': Ivan_Stambolic
'2259': Ivana_Trump
'2260': Iveta_Benesova
'2261': Ivo_Dubs
'2262': Izzat_Ibrahim
'2263': JC_Chasez
'2264': JJ_Redick
'2265': JK_Rowling
'2266': JP_Suarez
'2267': JT_Snow
'2268': Jaap_de_Hoop_Scheffer
'2269': Jack_Goodman
'2270': Jack_Grubman
'2271': Jack_Knowlton
'2272': Jack_LaLanne
'2273': Jack_Nicholson
'2274': Jack_Osbourne
'2275': Jack_Smith
'2276': Jack_Straw
'2277': Jack_Valenti
'2278': Jack_Welch
'2279': Jackie_Chan
'2280': Jackie_Dennis
'2281': Jackie_Sherrill
'2282': Jacky_Cheung
'2283': Jacob_Frenkel
'2284': Jacqueline_Edwards
'2285': Jacqueline_Gold
'2286': Jacqueline_Marris
'2287': Jacqueline_Obradors
'2288': Jacques_Chirac
'2289': Jacques_Kallis
'2290': Jacques_Rogge
'2291': Jacques_Villeneuve
'2292': Jada_Pinkett_Smith
'2293': Jade_Jagger
'2294': Jafar_Umar_Thalib
'2295': Jaime_Orti
'2296': Jaime_Pressly
'2297': Jake_Brace
'2298': Jake_Gyllenhaal
'2299': Jake_Plummer
'2300': Jakob_Kellenberger
'2301': Jalal_Talabani
'2302': Jalen_Rose
'2303': James_Baker
'2304': James_Ballenger
'2305': James_Barksdale
'2306': James_Becker
'2307': James_Blake
'2308': James_Brazelton
'2309': James_Brosnahan
'2310': James_Brown
'2311': James_Butts
'2312': James_Caan
'2313': James_Cameron
'2314': James_Carville
'2315': James_Coburn
'2316': James_Collinson
'2317': James_Comey
'2318': James_Coviello
'2319': James_Cunningham
'2320': James_Dingemans
'2321': James_Franco
'2322': James_Gandolfini
'2323': James_Gibson
'2324': James_Hakett
'2325': James_Hallock
'2326': James_Harris
'2327': James_Hill
'2328': James_Hoffa
'2329': James_Hughes
'2330': James_Ivory
'2331': James_Jones
'2332': James_Kelly
'2333': James_Kirtley
'2334': James_Kopp
'2335': James_Layug
'2336': James_Lockhart
'2337': James_Maguire
'2338': James_Mathis
'2339': James_May
'2340': James_McGreevey
'2341': James_McMahon
'2342': James_McPherson
'2343': James_Meeks
'2344': James_Meredeth
'2345': James_Morris
'2346': James_Murdoch
'2347': James_Parker
'2348': James_Phelps
'2349': James_Roberts
'2350': James_Robertson_Jr
'2351': James_Schultz
'2352': James_Sensenbrenner
'2353': James_Smith
'2354': James_Spalding
'2355': James_Traficant
'2356': James_W_Kennedy
'2357': James_Wallack
'2358': James_Watt
'2359': James_Wattana
'2360': James_Williams
'2361': James_Wolfensohn
'2362': James_Young
'2363': Jamie_Carey
'2364': Jamie_Cooke
'2365': Jamie_Dimon
'2366': Jamie_Kellner
'2367': Jamie_King
'2368': Jamie_Lee_Curtis
'2369': Jamie_Martin
'2370': Jamie_Olis
'2371': Jamie_Villafane
'2372': Jamir_Miller
'2373': Jamling_Norgay
'2374': Jan-Michael_Gambill
'2375': Jan_Bjoerklund
'2376': Jan_De_Bont
'2377': Jan_Paul_Miller
'2378': Jan_Peter_Balkenende
'2379': Jan_Petersen
'2380': Jan_Pronk
'2381': Jan_Ullrich
'2382': Jan_van_Breda_Kolff
'2383': Jana_Henke
'2384': Jana_Pittman
'2385': Jane_Clayson
'2386': Jane_Fonda
'2387': Jane_Kaczmarek
'2388': Jane_Krakowski
'2389': Jane_Leeves
'2390': Jane_Menelaus
'2391': Jane_Pauley
'2392': Jane_Riley
'2393': Jane_Rooney
'2394': Jane_Russell
'2395': Jane_Walker_Wood
'2396': Janela_Jara
'2397': Janet_Chandler
'2398': Janet_Crawford
'2399': Janet_Ecker
'2400': Janet_Horvath
'2401': Janet_Leigh
'2402': Janet_Napolitano
'2403': Janet_Thorpe
'2404': Janette_Husarova
'2405': Janez_Drnovsek
'2406': Janica_Kostelic
'2407': Janice_Abreu
'2408': Janice_Goldfinger
'2409': Janine_Pietsch
'2410': Janis_Ruth_Coulter
'2411': Janusz_Kaminski
'2412': Jaouad_Gharib
'2413': Jaqueline_Godoy
'2414': Jaromir_Jagr
'2415': Jason_Alexander
'2416': Jason_Bentley
'2417': Jason_Biggs
'2418': Jason_Campbell
'2419': Jason_Clermont
'2420': Jason_Gardner
'2421': Jason_Jennings
'2422': Jason_Kapono
'2423': Jason_Keep
'2424': Jason_Kidd
'2425': Jason_Lezak
'2426': Jason_Mewes
'2427': Jason_Petty
'2428': Jason_Priestley
'2429': Jason_Sehorn
'2430': Jason_Sorens
'2431': Jason_Statham
'2432': Jason_Vale
'2433': Jason_White
'2434': Javier_Bardem
'2435': Javier_Camara
'2436': Javier_Delgado
'2437': Javier_Saviola
'2438': Javier_Solana
'2439': Javier_Vargas
'2440': Javier_Vazquez
'2441': Javier_Weber
'2442': Javier_Zanetti
'2443': Jawad_Boulus
'2444': Jay_Garner
'2445': Jay_Leno
'2446': Jay_Rasulo
'2447': Jaymon_Crabb
'2448': Jayne_Yarris
'2449': Jayson_Williams
'2450': Jean-Claude_Braquet
'2451': Jean-Claude_Juncker
'2452': Jean-Claude_Trichet
'2453': Jean-Claude_Van_Damme
'2454': Jean-David_Levitte
'2455': Jean-Francois_Lemounier
'2456': Jean-Francois_Pontal
'2457': Jean-Luc_Bideau
'2458': Jean-Marc_Olive
'2459': Jean-Marc_de_La_Sabliere
'2460': Jean-Patrick_Nazon
'2461': Jean-Pierre_Bemba
'2462': Jean-Pierre_Raffarin
'2463': Jean-Rene_Fourtou
'2464': Jean-Sebastien_Giguere
'2465': Jean_Brumley
'2466': Jean_Carnahan
'2467': Jean_Charest
'2468': Jean_Chretien
'2469': Jean_Nagel
'2470': Jean_Todt
'2471': Jeane_Kirkpatrick
'2472': Jeanette_Gray
'2473': Jeanette_Stauffer
'2474': Jeanne_Anne_Schroeder
'2475': Jeanne_Moreau
'2476': Jeannette_Biedermann
'2477': Jeb_Bush
'2478': Jeff_Bridges
'2479': Jeff_Bzdelik
'2480': Jeff_Dederian
'2481': Jeff_Feldman
'2482': Jeff_George
'2483': Jeff_Hornacek
'2484': Jeff_Roehm
'2485': Jeff_Schiffner
'2486': Jeff_Van_Gundy
'2487': Jeff_Weaver
'2488': Jefferson_Perez
'2489': Jeffery_Hendren
'2490': Jeffery_Strelzin
'2491': Jeffrey_Archer
'2492': Jeffrey_Ashby
'2493': Jeffrey_Donaldson
'2494': Jeffrey_Immelt
'2495': Jeffrey_Jones
'2496': Jeffrey_Katzenberg
'2497': Jeffrey_Pfeffer
'2498': Jeffrey_Scott_Postell
'2499': Jelena_Dokic
'2500': Jen_Bice
'2501': Jen_Schefft
'2502': Jenna_Elfman
'2503': Jennette_Bradley
'2504': Jennie_Finch
'2505': Jennie_Garth
'2506': Jennifer_Aniston
'2507': Jennifer_Capriati
'2508': Jennifer_Connelly
'2509': Jennifer_Furminger
'2510': Jennifer_Garner
'2511': Jennifer_Granholm
'2512': Jennifer_Gratz
'2513': Jennifer_Keller
'2514': Jennifer_Lopez
'2515': Jennifer_Love_Hewitt
'2516': Jennifer_McCoy
'2517': Jennifer_Murray
'2518': Jennifer_Pena
'2519': Jennifer_Reilly
'2520': Jennifer_Renee_Short
'2521': Jennifer_Rodriguez
'2522': Jennifer_Thompson
'2523': Jennifer_Tilly
'2524': Jenny_Romero
'2525': Jens_Lehmann
'2526': Jeong_Se-hyun
'2527': Jerelle_Kraus
'2528': Jeremy_Fogel
'2529': Jeremy_Gompertz
'2530': Jeremy_Greenstock
'2531': Jeremy_Shockey
'2532': Jeremy_Wotherspoon
'2533': Jeri_Ryan
'2534': Jerome_Golmard
'2535': Jerome_Jenkins
'2536': Jerry_Angelo
'2537': Jerry_Bruckheimer
'2538': Jerry_Colangelo
'2539': Jerry_Falwell
'2540': Jerry_Hall
'2541': Jerry_Jones
'2542': Jerry_Lewis
'2543': Jerry_McEntee
'2544': Jerry_Oliver
'2545': Jerry_Pauley
'2546': Jerry_Regier
'2547': Jerry_Rice
'2548': Jerry_Seinfeld
'2549': Jerry_Sexton
'2550': Jerry_Sloan
'2551': Jerry_Springer
'2552': Jerry_Tarkanian
'2553': Jesper_Parnevik
'2554': Jesse_Harris
'2555': Jesse_Helms
'2556': Jesse_Jackson
'2557': Jesse_James
'2558': Jesse_James_Leija
'2559': Jesse_Ventura
'2560': Jessica_Alba
'2561': Jessica_Biel
'2562': Jessica_Brungo
'2563': Jessica_Capshaw
'2564': Jessica_Lange
'2565': Jessica_Lynch
'2566': Jessica_Simpson
'2567': Jesus_Cardenal
'2568': Jewel_Howard-Taylor
'2569': Jia_Qinglin
'2570': Jiang_Zemin
'2571': Jim_Abbott
'2572': Jim_Ahern
'2573': Jim_Anderson
'2574': Jim_Beattie
'2575': Jim_Bollman
'2576': Jim_Bunning
'2577': Jim_Calhoun
'2578': Jim_Cantalupo
'2579': Jim_Carrey
'2580': Jim_Doyle
'2581': Jim_Edmonds
'2582': Jim_Fassel
'2583': Jim_Flaherty
'2584': Jim_Freudenberg
'2585': Jim_Furyk
'2586': Jim_Greenwood
'2587': Jim_Hahn
'2588': Jim_Hardin
'2589': Jim_Harrick
'2590': Jim_Haslett
'2591': Jim_Hendry
'2592': Jim_Jeffords
'2593': Jim_Kelly
'2594': Jim_Leach
'2595': Jim_Letten
'2596': Jim_Nochols
'2597': Jim_OBrien
'2598': Jim_Otto
'2599': Jim_Parque
'2600': Jim_Paxson
'2601': Jim_Piper
'2602': Jim_Ryan
'2603': Jim_Schwarz
'2604': Jim_Spinoza
'2605': Jim_Sterk
'2606': Jim_Talent
'2607': Jim_Taylor
'2608': Jim_Thome
'2609': Jim_Tressel
'2610': Jim_Wall
'2611': Jim_Wessling
'2612': Jim_Wong
'2613': Jim_Zorn
'2614': Jimmy_Carter
'2615': Jimmy_Gobble
'2616': Jimmy_Gurule
'2617': Jimmy_Iovine
'2618': Jimmy_Jimenez
'2619': Jimmy_Kimmel
'2620': Jimmy_Lee
'2621': Jimmy_Smits
'2622': Jimmy_Szymanski
'2623': Jiri_Novak
'2624': Jo_Dee_Messina
'2625': Jo_Joong-hyon
'2626': Joan_Claybrook
'2627': Joan_Collins
'2628': Joan_Dangerfield
'2629': Joan_Jett
'2630': Joan_Laporta
'2631': Joanna_Poitier
'2632': Joanne_Duquette
'2633': Joanne_Woodward
'2634': Joao_Rocha
'2635': Joaquim_Levy
'2636': Joaquim_Rodriguez
'2637': Joaquin_Phoenix
'2638': Joaquin_Sanchez
'2639': Job_Cohen
'2640': Jodie_Foster
'2641': Jodie_Henry
'2642': Jodie_Kidd
'2643': Jody_Richards
'2644': Joe_Calzaghe
'2645': Joe_Carnahan
'2646': Joe_Cocker
'2647': Joe_Cravens
'2648': Joe_Crede
'2649': Joe_Darrell
'2650': Joe_DeLamielleure
'2651': Joe_Dicaro
'2652': Joe_Dumars
'2653': Joe_Finley
'2654': Joe_Friedberg
'2655': Joe_Garner
'2656': Joe_Gatti
'2657': Joe_Glover
'2658': Joe_Leonard
'2659': Joe_Lieberman
'2660': Joe_Mantegna
'2661': Joe_Mantello
'2662': Joe_Mendes
'2663': Joe_Metz
'2664': Joe_Nichols
'2665': Joe_Pantoliano
'2666': Joe_Paterno
'2667': Joe_Plumeri
'2668': Joe_Strummer
'2669': Joe_Torre
'2670': Joe_Vandever
'2671': Joel_Gallen
'2672': Joel_Todd
'2673': Joerg_Haider
'2674': Joey_Buttafuoco
'2675': Joey_Harrington
'2676': Joey_Mantia
'2677': Johan_Bruyneel
'2678': Johannes_Rau
'2679': John_Abizaid
'2680': John_Allen_Muhammad
'2681': John_Anderson
'2682': John_Ashcroft
'2683': John_Baldacci
'2684': John_Banko
'2685': John_Barnett
'2686': John_Belushi
'2687': John_Blaney
'2688': John_Bolton
'2689': John_Bond
'2690': John_Brady
'2691': John_Burkett
'2692': John_Burnett
'2693': John_Connolly
'2694': John_Coomber
'2695': John_Cornyn
'2696': John_Cruz
'2697': John_Cusack
'2698': John_Dallager
'2699': John_Daly_Jr
'2700': John_Danforth
'2701': John_Darby
'2702': John_Duprey
'2703': John_Eastman
'2704': John_Eder
'2705': John_Edwards
'2706': John_Elway
'2707': John_Engler
'2708': John_F_Kennedy_Jr
'2709': John_Fenn
'2710': John_Ferguson
'2711': John_Fox
'2712': John_Franco
'2713': John_Garamendi
'2714': John_Geoghan
'2715': John_Goold
'2716': John_Gordnick
'2717': John_Gruden
'2718': John_Hartson
'2719': John_Henry
'2720': John_Herrington
'2721': John_Howard
'2722': John_Jones
'2723': John_Jumper
'2724': John_Kerr
'2725': John_Kerry
'2726': John_Lawrence
'2727': John_Leguizamo
'2728': John_Lennon
'2729': John_Lisowski
'2730': John_Lithgow
'2731': John_Lynch
'2732': John_Mabry
'2733': John_Madden
'2734': John_Malkovich
'2735': John_Manley
'2736': John_Marburger
'2737': John_Mayer
'2738': John_McCain
'2739': John_McCallum
'2740': John_McCormack
'2741': John_McEnroe
'2742': John_McKay
'2743': John_Moe
'2744': John_Moxley
'2745': John_Nash
'2746': John_Negroponte
'2747': John_Nimmo
'2748': John_Norquist
'2749': John_Paul_DeJoria
'2750': John_Paul_II
'2751': John_Perrota
'2752': John_Petty
'2753': John_Philip_Elkann
'2754': John_Prescott
'2755': John_Reid
'2756': John_Reilly
'2757': John_Richardson
'2758': John_Rigas
'2759': John_Robbins
'2760': John_Rosa
'2761': John_Rowe
'2762': John_Rowland
'2763': John_Ruiz
'2764': John_Rusnak
'2765': John_Salazar
'2766': John_Scarlett
'2767': John_Sidgmore
'2768': John_Snow
'2769': John_Spencer
'2770': John_Stallworth
'2771': John_Starks
'2772': John_Stockton
'2773': John_Sununu
'2774': John_Sweeney
'2775': John_Swofford
'2776': John_Taylor
'2777': John_Thune
'2778': John_Timoney
'2779': John_Travolta
'2780': John_Tyson
'2781': John_Velazquez
'2782': John_Walsh
'2783': John_Warner
'2784': John_Wayne
'2785': John_White
'2786': John_Williams
'2787': John_Wolf
'2788': John_Wright
'2789': Johnnie_Lynn
'2790': Johnny_Benson
'2791': Johnny_Carson
'2792': Johnny_Depp
'2793': Johnny_Hallyday
'2794': Johnny_Htu
'2795': Johnny_Tapia
'2796': Johnny_Unitas
'2797': Johnson_Panjaitan
'2798': Jolanta_Kwasniewski
'2799': Jon_Constance
'2800': Jon_Corzine
'2801': Jon_Gruden
'2802': Jon_Kitna
'2803': Jon_Stewart
'2804': Jon_Voight
'2805': Jonathan_Arden
'2806': Jonathan_Byrd
'2807': Jonathan_Edwards
'2808': Jonathan_Fine
'2809': Jonathan_Horton
'2810': Jonathan_Karsh
'2811': Jonathan_Mostow
'2812': Jonathan_Schroeder
'2813': Jonathan_Tiomkin
'2814': Jonathan_Woodgate
'2815': Jong_Thae_Hwa
'2816': Jong_Wook_Lee
'2817': Jorge_Alberto_Galindo
'2818': Jorge_Arce
'2819': Jorge_Batlle
'2820': Jorge_Castaneda
'2821': Jorge_Enrique_Jimenez
'2822': Jorge_Marquez-Ruarte
'2823': Jorge_Moreno
'2824': Jorge_Quiroga
'2825': Jorge_Rodolfo_Canicoba_Corral
'2826': Jorge_Valdano
'2827': Jorma_Huhtala
'2828': Joschka_Fischer
'2829': Jose_Acasuso
'2830': Jose_Alencar
'2831': Jose_Bove
'2832': Jose_Canseco
'2833': Jose_Canseco_Sr
'2834': Jose_Carlo_Fernandez
'2835': Jose_Carreras
'2836': Jose_Cevallos
'2837': Jose_Dirceu
'2838': Jose_Genoino
'2839': Jose_Jose
'2840': Jose_Lina
'2841': Jose_Lopez_Beltran
'2842': Jose_Luis_Chilavert
'2843': Jose_Luis_Rodriguez_Zapatero
'2844': Jose_Luis_Santiago_Vasconcelos
'2845': Jose_Manuel_Durao_Barroso
'2846': Jose_Maria_Aznar
'2847': Jose_Miguel_Aleman
'2848': Jose_Mourinho
'2849': Jose_Rosado
'2850': Jose_Santos
'2851': Jose_Sarney
'2852': Jose_Serra
'2853': Jose_Theodore
'2854': Jose_Vicente_Rangel
'2855': Jose_Viegas_Filho
'2856': Jose_Woldenberg
'2857': Joseph_Biden
'2858': Joseph_Blatter
'2859': Joseph_Deiss
'2860': Joseph_Estrada
'2861': Joseph_Fiennes
'2862': Joseph_Galante
'2863': Joseph_Ganim
'2864': Joseph_Hoy
'2865': Joseph_Kabila
'2866': Joseph_LePore
'2867': Joseph_Lopez
'2868': Joseph_Nacchio
'2869': Joseph_Ralston
'2870': Joseph_Safra
'2871': Joseph_Salgado
'2872': Josh_Childress
'2873': Josh_Evans
'2874': Josh_Kronfeld
'2875': Joshua_Davey
'2876': Joshua_Gracin
'2877': Joshua_Harapko
'2878': Joshua_Perper
'2879': Joxel_Garcia
'2880': Joy_Bryant
'2881': Joy_Lee_Sadler
'2882': Juan_Antonio_Samaranch
'2883': Juan_Carlos
'2884': Juan_Carlos_Ferrero
'2885': Juan_Carlos_Morales
'2886': Juan_Carlos_Ortega
'2887': Juan_Fernandez
'2888': Juan_Francisco_Palencia
'2889': Juan_Ignacio_Chela
'2890': Juan_Jose_Lucas
'2891': Juan_Manuel_Marquez
'2892': Juan_Pablo_Montoya
'2893': Juan_Roman_Carrasco
'2894': Juan_Roman_Riquelme
'2895': Juan_Sanchez
'2896': Juan_Valencia_Osorio
'2897': Juanes
'2898': Judd_Davies
'2899': Jude_Law
'2900': Judi_Dench
'2901': Judi_Patton
'2902': Judith_Nathan
'2903': Judy_Dean
'2904': Judy_Genshaft
'2905': Judy_Locy
'2906': Judy_Spreckels
'2907': Judy_Vassar
'2908': Juergen_Braehmer
'2909': Juergen_Chrobog
'2910': Juergen_Peters
'2911': Juergen_Schrempp
'2912': Juergen_Trittin
'2913': Jules_Asner
'2914': Julia_Glass
'2915': Julia_Ormond
'2916': Julia_Tymoshenko
'2917': Julian_Battle
'2918': Julian_Fantino
'2919': Julianna_Margulies
'2920': Julianne_Moore
'2921': Julie_Andrews
'2922': Julie_Gerberding
'2923': Julie_Goodenough
'2924': Julie_Infante
'2925': Julie_Taymor
'2926': Julien_Boutter
'2927': Julien_Varlet
'2928': Juliette_Binoche
'2929': Juliette_Lewis
'2930': Julio_Cesar_Chavez
'2931': Julio_Cesar_Franco
'2932': Julio_De_Brun
'2933': Julio_Iglesias_Jr
'2934': Julio_Rossi
'2935': Julio_Toro
'2936': Julius_Barnes
'2937': Julius_Erving
'2938': Juljia_Vysotskij
'2939': Jung_Bong
'2940': Junichi_Inamoto
'2941': Junichiro_Koizumi
'2942': Junko_Tabei
'2943': Justin_Gatlin
'2944': Justin_Guarini
'2945': Justin_Leonard
'2946': Justin_Marshall
'2947': Justin_Timberlake
'2948': Justin_Wilson
'2949': Justine_Henin
'2950': Justine_Pasek
'2951': Kai-Uwe_Ricke
'2952': Kaio_Almeida
'2953': Kaisser_Bazan
'2954': Kajsa_Bergqvist
'2955': Kalid_Kaid
'2956': Kalpana_Chawla
'2957': Kamal_Kharrazi
'2958': Kamel_Morjane
'2959': Kang_Gum-sil
'2960': Kaoru_Hasuike
'2961': Kara_Lynn_Joyce
'2962': Kareena_Kapoor
'2963': Karen_Allen
'2964': Karen_Clarkson
'2965': Karen_Lynn_Gorney
'2966': Karen_Mok
'2967': Karen_Pereiras
'2968': Karen_Sharpe_Kramer
'2969': Karin_Pilsaeter
'2970': Karin_Stoiber
'2971': Karin_Viard
'2972': Karl-Heinz_Rummenigge
'2973': Karol_Kucera
'2974': Kaspar_Villiger
'2975': Katalin_Kollat
'2976': Kate_Burton
'2977': Kate_Capshaw
'2978': Kate_Hudson
'2979': Kate_Lee
'2980': Kate_Moss
'2981': Kate_Richardson
'2982': Kate_Starbird
'2983': Kate_Winslet
'2984': Katerina_Smrzova
'2985': Katharine_Hepburn
'2986': Katherine_Harris
'2987': Kathie_Louise_Saunders
'2988': Kathleen_Abernathy
'2989': Kathleen_Glynn
'2990': Kathleen_Kennedy_Townsend
'2991': Kathryn_Bigelow
'2992': Kathryn_Grayson
'2993': Kathryn_Morris
'2994': Kathryn_Tucker
'2995': Kathy_Baker
'2996': Kathy_Bates
'2997': Kathy_Gannon
'2998': Kathy_Winters
'2999': Katie_Boone
'3000': Katie_Couric
'3001': Katie_Harman
'3002': Katie_Holmes
'3003': Katie_Smith
'3004': Katie_Wagner
'3005': Katja_Riemann
'3006': Katrin_Cartlidge
'3007': Katrin_Susi
'3008': Kay_Bailey_Hutchison
'3009': Kay_Behrensmeyer
'3010': Kaye_Young
'3011': Keanu_Reeves
'3012': Keiko_Sofia_Fujimori
'3013': Keira_Knightley
'3014': Keith_Bishop_Jr
'3015': Keith_Bogans
'3016': Keith_Brown
'3017': Keith_Fotta
'3018': Keith_Foulke
'3019': Keith_Lockhart
'3020': Keith_Lowen
'3021': Keith_Olbermann
'3022': Keith_Osik
'3023': Keith_Rodriguez
'3024': Keith_Snyder
'3025': Keith_Tyson
'3026': Keith_Urban
'3027': Keith_Van_Horn
'3028': Keizo_Yamada
'3029': Kelli_White
'3030': Kellie_Coffey
'3031': Kellie_Greene
'3032': Kelly_Clarkson
'3033': Kelly_Leigh
'3034': Kelly_Osbourne
'3035': Kelly_Ripa
'3036': Kelly_Santos
'3037': Kelsey_Grammer
'3038': Kelvin_Sampson
'3039': Kemal_Dervis
'3040': Ken_Balk
'3041': Ken_Dorsey
'3042': Ken_Kutaragi
'3043': Ken_Loach
'3044': Ken_Macha
'3045': Ken_Watanabe
'3046': Ken_Wharfe
'3047': Kenenisa_Bekele
'3048': Kenneth_Bowersox
'3049': Kenneth_Branagh
'3050': Kenneth_Brill
'3051': Kenneth_Carlsen
'3052': Kenneth_Cooper
'3053': Kenneth_Dam
'3054': Kenneth_Evans
'3055': Kenneth_Reichert
'3056': Kenny_Brack
'3057': Kenny_Chesney
'3058': Kent_McCord
'3059': Kent_Robinson
'3060': Kent_Rominger
'3061': Kevin_Borseth
'3062': Kevin_Costner
'3063': Kevin_Crane
'3064': Kevin_Garnett
'3065': Kevin_Gil
'3066': Kevin_Harvick
'3067': Kevin_Hearn
'3068': Kevin_James
'3069': Kevin_Keegan
'3070': Kevin_Marshall
'3071': Kevin_Millwood
'3072': Kevin_Nealon
'3073': Kevin_Satterfield
'3074': Kevin_Sorbo
'3075': Kevin_Spacey
'3076': Kevin_Stallings
'3077': Kevin_Tarrant
'3078': Khader_Rashid_Rahim
'3079': Khaled_Sawalhi
'3080': Khalid_Khannouchi
'3081': Khalid_Qazi
'3082': Khatol_Mohammad_Zai
'3083': Khin_Nyunt
'3084': Khum_Bahadur_Khadka
'3085': Kieran_Culkin
'3086': Kieran_Prendergast
'3087': Kifah_Ajouri
'3088': Kiki_Vandeweghe
'3089': Kim_Cattrall
'3090': Kim_Chinn
'3091': Kim_Clijsters
'3092': Kim_Dae-jung
'3093': Kim_Dong-hwa
'3094': Kim_Dong-tae
'3095': Kim_Gandy
'3096': Kim_Hong-gul
'3097': Kim_Hong-up
'3098': Kim_Jin-sun
'3099': Kim_Jong-Il
'3100': Kim_Ryong-sung
'3101': Kim_Su_Nam
'3102': Kim_Weeks
'3103': Kim_Yong-il
'3104': Kim_Yun-kyu
'3105': Kimberly_Bruckner
'3106': Kimberly_Stewart
'3107': Kimi_Raikkonen
'3108': Kimora_Lee
'3109': King_Abdullah_II
'3110': King_Bhumibol_Adulyadej
'3111': King_Gyanendra
'3112': Kirby_Puckett
'3113': Kirk_Doerger
'3114': Kirk_Douglas
'3115': Kirk_Ferentz
'3116': Kirk_Franklin
'3117': Kirk_Johnson
'3118': Kirsten_Clark
'3119': Kirsten_Dunst
'3120': Kirsten_Gilham
'3121': Kit_Bond
'3122': Kitin_Munoz
'3123': Kjell_Magne_Bondevik
'3124': Klaus_Schwab
'3125': Klaus_Zwickel
'3126': Kobe_Bryant
'3127': Kofi_Annan
'3128': Koichi_Haraguchi
'3129': Koichi_Tanaka
'3130': Koichiro_Matsuura
'3131': Koji_Uehara
'3132': Kong_Quan
'3133': Kostya_Tszyu
'3134': Kosuke_Kitajima
'3135': Krishna_Bhadur_Mahara
'3136': Kristanna_Loken
'3137': Kristen_Breitweiser
'3138': Kristen_Rivera
'3139': Kristin_Chenoweth
'3140': Kristin_Davis
'3141': Kristin_Scott
'3142': Kristin_Scott_Thomas
'3143': Kristy_Curry
'3144': Kultida_Woods
'3145': Kurt_Budke
'3146': Kurt_Busch
'3147': Kurt_Hellstrom
'3148': Kurt_Russell
'3149': Kurt_Schottenheimer
'3150': Kurt_Suzuki
'3151': Kurt_Tanabe
'3152': Kurt_Thomas
'3153': Kurt_Warner
'3154': Kwame_Kilpatrick
'3155': Kweisi_Mfume
'3156': Kwon_Yang-sook
'3157': Kwon_Young-gil
'3158': Kyle_McLaren
'3159': Kyle_Shewfelt
'3160': Kyoko_Nakayama
'3161': Kyra_Sedgwick
'3162': LK_Advani
'3163': Lachlan_Murdoch
'3164': Laila_Ali
'3165': Lana_Clarkson
'3166': Lance_Armstrong
'3167': Lance_Bass
'3168': Landon_Donovan
'3169': Lane_Bryant
'3170': Lane_Odom
'3171': Lara_Logan
'3172': Larenz_Tate
'3173': Larry_Anderson
'3174': Larry_Beinfest
'3175': Larry_Bowa
'3176': Larry_Brown
'3177': Larry_Campbell
'3178': Larry_Coker
'3179': Larry_Donald
'3180': Larry_Ellison
'3181': Larry_Eustachy
'3182': Larry_Flynt
'3183': Larry_Greene
'3184': Larry_Hagman
'3185': Larry_Hahn
'3186': Larry_Harris
'3187': Larry_Johnson
'3188': Larry_Lindsey
'3189': Larry_Lucchino
'3190': Larry_Nichols
'3191': Larry_Pleau
'3192': Larry_Ralston
'3193': Larry_Tanenbaum
'3194': Larry_Templeton
'3195': Larry_Thompson
'3196': Larry_Wilmore
'3197': Lars_Burgsmuller
'3198': Lars_Von_Trier
'3199': Laszlo_Kovacs
'3200': Latrell_Sprewell
'3201': Laura_Bozzo
'3202': Laura_Bush
'3203': Laura_Elena_Harring
'3204': Laura_Flessel
'3205': Laura_Gobai
'3206': Laura_Hernandez
'3207': Laura_Linney
'3208': Laura_Marlow
'3209': Laura_Morante
'3210': Laura_Pausini
'3211': Laura_Romero
'3212': Laura_Schlessinger
'3213': Laura_Ziskin
'3214': Laurel_Clark
'3215': Lauren_Hutton
'3216': Lauren_Killian
'3217': Laurence_Fishburne
'3218': Laurence_Tribe
'3219': Laurent_Gbagbo
'3220': Laurent_Jalabert
'3221': Laurent_Woulzy
'3222': Laurie_Chan
'3223': Laurie_Hobbs
'3224': Laurie_Laychak
'3225': Laurie_Pirtle
'3226': Lawrence_Di_Rita
'3227': Lawrence_Foley
'3228': Lawrence_MacAulay
'3229': Lawrence_Roberts
'3230': Lawrence_Vito
'3231': Lazaro_Castro
'3232': LeAnn_Rimes
'3233': LeBron_James
'3234': LeRoy_Millette_Jr
'3235': Lea_Fastow
'3236': Leah_Remini
'3237': Leander_Paes
'3238': Leandrinho_Barbosa
'3239': Leandro_Andrade
'3240': Leandro_Garcia
'3241': Lech_Walesa
'3242': Lee_Ann_Knight
'3243': Lee_Ann_Terlaji
'3244': Lee_Ann_Womack
'3245': Lee_Baca
'3246': Lee_Byung-woong
'3247': Lee_Chang-dong
'3248': Lee_Hoi-chang
'3249': Lee_Hong-ki
'3250': Lee_Hyung-taik
'3251': Lee_Jun
'3252': Lee_Nam-shin
'3253': Lee_Soo-hyuck
'3254': Lee_Tae-sik
'3255': Lee_Yeo-jin
'3256': Lee_Yuan-tseh
'3257': Leigh_Winchell
'3258': Leisel_Jones
'3259': Lela_Rochon
'3260': Leland_Chapman
'3261': Lemuel_Montulo
'3262': Len_Jenoff
'3263': Lena_Katina
'3264': Lena_Olin
'3265': Lene_Espersen
'3266': Leni_Bjorklund
'3267': Lennart_Johansson
'3268': Lennox_Lewis
'3269': Lenny_Kravitz
'3270': Lenny_Wilkens
'3271': Leo_Mullin
'3272': Leo_Ramirez
'3273': Leon_Barmore
'3274': Leon_LaPorte
'3275': Leon_Lai
'3276': Leon_Silver
'3277': Leonard_Glick
'3278': Leonard_Hamilton
'3279': Leonard_Schrank
'3280': Leonardo_Del_Vecchio
'3281': Leonardo_DiCaprio
'3282': Leonardo_Fernandez
'3283': Leonid_Kuchma
'3284': Lesia_Burlak
'3285': Lesley_Coppin
'3286': Lesley_Flood
'3287': Lesley_McCulloch
'3288': Leslie_Ann_Woodward
'3289': Leslie_Caldwell
'3290': Leslie_Moonves
'3291': Leslie_Wiser_Jr
'3292': Lester_Holt
'3293': Leszek_Miller
'3294': Leticia_Dolera
'3295': Leticia_Van_de_Putte
'3296': Leuris_Pupo
'3297': Lew_Rywin
'3298': Lewis_Booth
'3299': Li_Changchun
'3300': Li_Ka-shing
'3301': Li_Peng
'3302': Li_Ruihuan
'3303': Li_Zhaoxing
'3304': Liam_Neeson
'3305': Liane_Janda
'3306': Lidija_Djukanovic
'3307': Lili_Marinho
'3308': Lili_Taylor
'3309': Liliana_Cavani
'3310': Lily_Safra
'3311': Lily_Tomlin
'3312': Lim_Dong-won
'3313': Lima_Azimi
'3314': Lin_Yi-fu
'3315': Lin_Yung_Hsi
'3316': Lina_Krasnoroutskaya
'3317': Lincoln_Chafee
'3318': Linda_Amicangioli
'3319': Linda_Baboolal
'3320': Linda_Dano
'3321': Linda_Franklin
'3322': Linda_Ham
'3323': Linda_Lingle
'3324': Linda_Mason
'3325': Linda_Sanchez
'3326': Lindsay_Benko
'3327': Lindsay_Davenport
'3328': Lindsay_Lohan
'3329': Lindsey_Graham
'3330': Lindy_Ruff
'3331': Linn_Thornton
'3332': Lino_Oviedo
'3333': Linus_Roache
'3334': Lionel_Chalmers
'3335': Lionel_Hampton
'3336': Lionel_Richie
'3337': Lisa_Girman
'3338': Lisa_Gottsegen
'3339': Lisa_Leslie
'3340': Lisa_Ling
'3341': Lisa_Marie_Presley
'3342': Lisa_Murkowski
'3343': Lisa_Raymond
'3344': Lisa_Stansfield
'3345': Lisa_Stone
'3346': Liu_Mingkang
'3347': Liu_Xiaoqing
'3348': Liu_Ye
'3349': Liv_Tyler
'3350': Liza_Minnelli
'3351': Lleyton_Hewitt
'3352': Lloyd_Mudiwa
'3353': Lloyd_Novick
'3354': Lloyd_Richards
'3355': Lloyd_Ward
'3356': Lois_Smart
'3357': Lokendra_Bahadur_Chand
'3358': Lon_Kruger
'3359': Lonnie_Donegan
'3360': Lope_Mendoza
'3361': Lord_Hutton
'3362': Loretta_Lynn_Harper
'3363': Lori_Berenson
'3364': Lorne_Michaels
'3365': Lorraine_Bracco
'3366': Lorraine_Fenton
'3367': Lou_Lang
'3368': Lou_Piniella
'3369': Lou_Reed
'3370': Lou_Ye
'3371': Louis_Van_Gaal
'3372': Louisa_Baileche
'3373': Lubomir_Zaoralek
'3374': Luc_Montagnier
'3375': Luca_Cordero_di_Montezemolo
'3376': Lucas_Wysocki
'3377': Lucia_Kenny_Anthony
'3378': Luciano_Bovicelli
'3379': Luciano_Pavarotti
'3380': Lucie_Lapovsky
'3381': Lucio_Angulo
'3382': Lucio_Cecchinello
'3383': Lucio_Gutierrez
'3384': Lucio_Stanca
'3385': Lucrecia_Orozco
'3386': Lucy_Liu
'3387': Ludivine_Sagnier
'3388': Ludwig_Ovalle
'3389': Luis_Berrondo
'3390': Luis_Ernesto_Derbez_Bautista
'3391': Luis_Figo
'3392': Luis_Fonsi
'3393': Luis_Gonzalez
'3394': Luis_Gonzalez_Macchi
'3395': Luis_Guzman
'3396': Luis_Horna
'3397': Luis_Pujols
'3398': Luis_Rosario_Huertas
'3399': Luis_Sanchez
'3400': Luiz_Felipe_Scolari
'3401': Luiz_Inacio_Lula_da_Silva
'3402': Luke_Ridnour
'3403': Luke_Smith
'3404': Luke_Walton
'3405': Luo_Linquan
'3406': Luther_Htu
'3407': Lutz_Freitag
'3408': Lydia_Shum
'3409': Lyle_Lovett
'3410': Lyle_Vanclief
'3411': Lynn_Abraham
'3412': Lynn_Redgrave
'3413': Lynne_Cheney
'3414': Lynne_Slepian
'3415': Lynne_Thigpen
'3416': Lyudmila_Putin
'3417': MC_Hammer
'3418': Mack_Brown
'3419': Madeleine_Albright
'3420': Madeleine_Webber
'3421': Madge_Overhouse
'3422': Madonna
'3423': Mae_Jemison
'3424': Magda_Kertasz
'3425': Magdalena_Maleeva
'3426': Maggie_Cheung
'3427': Maggie_Smith
'3428': Magui_Serna
'3429': Maha_Habib
'3430': Mahathir_Mohamad
'3431': Mahdi_Al_Bassam
'3432': Mahendra_Chaudhry
'3433': Mahima_Chaudhari
'3434': Mahmoud_Abbas
'3435': Mahmoud_Al_Zhar
'3436': Mahmoud_Diyab_al-Ahmed
'3437': Makhdoom_Amin_Fahim
'3438': Makiko_Tanaka
'3439': Makiya_Ali_Hassan
'3440': Malak_Habbak
'3441': Malcolm_Glazer
'3442': Malcolm_Jamal_Warner
'3443': Malcolm_Wild
'3444': Malik_Mahmud
'3445': Mamdouh_Habib
'3446': Manfred_Reyes_Villa
'3447': Manfred_Stolpe
'3448': Manijeh_Hekmat
'3449': Manuel_Gehring
'3450': Manuel_Jesus
'3451': Manuel_Llorente
'3452': Manuel_Pellegrini
'3453': Manuel_Poggiali
'3454': Manuel_Rosendo
'3455': Manuela_Montebrun
'3456': Mara_Georges
'3457': Marat_Safin
'3458': Marc-Andre_Fleury
'3459': Marc_Anthony
'3460': Marc_Bulger
'3461': Marc_Gold
'3462': Marc_Grossman
'3463': Marc_Leger
'3464': Marc_Racicot
'3465': Marc_Shaiman
'3466': Marcella_Anderson
'3467': Marcelo_Bielsa
'3468': Marcelo_Ebrard
'3469': Marcelo_Rios
'3470': Marcelo_Salas
'3471': Marcio_de_Souza
'3472': Marco_Antonio_Barrera
'3473': Marco_Archer_Cardoso_Moreira
'3474': Marco_Irizarry
'3475': Marco_Pantani
'3476': Marcos_Cafu
'3477': Marcos_Daniel_Jimenez
'3478': Marcos_Milinkovic
'3479': Marcus_Allen
'3480': Marcus_Garrettson
'3481': Marcus_Gronholm
'3482': Margaret_Caruso
'3483': Margaret_Hasley
'3484': Margaret_Hoelzer
'3485': Margaret_Okayo
'3486': Margaret_Thatcher
'3487': Margerry_Bakley
'3488': Margie_Puente
'3489': Maria_Bello
'3490': Maria_Burks
'3491': Maria_Callas
'3492': Maria_Conchita_Alonso
'3493': Maria_Garcia
'3494': Maria_Guida
'3495': Maria_Luisa_Mendonca
'3496': Maria_Sanchez_Lorenzo
'3497': Maria_Shkolnikova
'3498': Maria_Shriver
'3499': Maria_Simon
'3500': Maria_Soledad_Alvear_Valenzuela
'3501': Maria_Wetterstrand
'3502': Mariah_Carey
'3503': Mariam_Ali_Hassan
'3504': Marian_Dolan
'3505': Mariana_Gonzalez
'3506': Mariana_Ohata
'3507': Mariana_Pollack
'3508': Mariangel_Ruiz_Torrealba
'3509': Marianne_Stanley
'3510': Mariano_Zabaleta
'3511': Maribel_Dominguez
'3512': Marie-Josee_Croze
'3513': Marie-Reine_Le_Gougne
'3514': Marie_Haghal
'3515': Marieta_Chrousala
'3516': Marilyn_Monroe
'3517': Marina_Anissina
'3518': Marina_Canetti
'3519': Marina_Hands
'3520': Marina_Kuptsova
'3521': Marina_Silva
'3522': Mario_Alfaro-Lopez
'3523': Mario_Austin
'3524': Mario_Cipollini
'3525': Mario_Dominguez
'3526': Mario_Dumont
'3527': Mario_Gallegos
'3528': Mario_Jardel
'3529': Mario_Kreutzberger
'3530': Mario_Lemieux
'3531': Mario_Lobo_Zagallo
'3532': Mario_Puzo
'3533': Mario_Vasquez_Rana
'3534': Marion_Barry
'3535': Marion_Fahnestock
'3536': Marisa_Tomei
'3537': Marisol_Breton
'3538': Marisol_Martinez_Sambran
'3539': Marissa_Jaret_Winokur
'3540': Maritza_Macias_Furano
'3541': Mark_Andrew
'3542': Mark_Bellhorn
'3543': Mark_Broxmeyer
'3544': Mark_Butcher
'3545': Mark_Cuban
'3546': Mark_Dacey
'3547': Mark_Everson
'3548': Mark_Foley
'3549': Mark_Gangloff
'3550': Mark_Geragos
'3551': Mark_Gottfried
'3552': Mark_Hamister
'3553': Mark_Hanson
'3554': Mark_Heller
'3555': Mark_Hogan
'3556': Mark_Hurlbert
'3557': Mark_Kelly
'3558': Mark_Komara
'3559': Mark_Lazarus
'3560': Mark_Leno
'3561': Mark_Mariscal
'3562': Mark_Martin
'3563': Mark_McClellan
'3564': Mark_Mishkin
'3565': Mark_Mulder
'3566': Mark_Philippoussis
'3567': Mark_Podlesny
'3568': Mark_Polansky
'3569': Mark_Redman
'3570': Mark_Richt
'3571': Mark_Rosenbaum
'3572': Mark_Sacco
'3573': Mark_Salter
'3574': Mark_Schweiker
'3575': Mark_Shapiro
'3576': Mark_Sisk
'3577': Mark_Stuart
'3578': Mark_Swartz
'3579': Mark_Wahlberg
'3580': Mark_Warner
'3581': Markus_Beyer
'3582': Markus_Naslund
'3583': Marlene_Weingartner
'3584': Marlon_Devonish
'3585': Marquier_Montano_Contreras
'3586': Marquis_Estill
'3587': Marricia_Tate
'3588': Marsah_Ambrosius
'3589': Marsha_Sharp
'3590': Marsha_Thomason
'3591': Marta_Dominguz
'3592': Martha_Beatriz_Roque
'3593': Martha_Bowen
'3594': Martha_Burk
'3595': Martha_Lucia_Ramirez
'3596': Martha_Martinez_Flores
'3597': Martha_Sahagun_de_Fox
'3598': Martha_Smith
'3599': Martha_Stewart
'3600': Martie_Maguire
'3601': Martin_Bandier
'3602': Martin_Boryczewski
'3603': Martin_Brodeur
'3604': Martin_Brooke
'3605': Martin_Burnham
'3606': Martin_Cauchon
'3607': Martin_Frost
'3608': Martin_Gecht
'3609': Martin_Hoellwarth
'3610': Martin_Howard
'3611': Martin_Keown
'3612': Martin_Kristof
'3613': Martin_Landau
'3614': Martin_Lawrence
'3615': Martin_Luther_King_III
'3616': Martin_McCauley
'3617': Martin_McGuinness
'3618': Martin_ONeill
'3619': Martin_Rodriguez
'3620': Martin_Scorsese
'3621': Martin_Sheen
'3622': Martin_Short
'3623': Martin_Torrijos
'3624': Martin_Verkerk
'3625': Martina_Hingis
'3626': Martina_McBride
'3627': Marty_Mornhinweg
'3628': Marvan_Atapattu
'3629': Marwan_Barghouthi
'3630': Marwan_Muasher
'3631': Mary-Kate_Olsen
'3632': Mary_Anne_Souza
'3633': Mary_Blige
'3634': Mary_Bono
'3635': Mary_Carey
'3636': Mary_Catherine_Correll
'3637': Mary_Descenza
'3638': Mary_Elizabeth_Mastrantonio
'3639': Mary_Frances_Seiter
'3640': Mary_Hill
'3641': Mary_Jo_Myers
'3642': Mary_Katherine_Smart
'3643': Mary_Landrieu
'3644': Mary_Lou_Markakis
'3645': Mary_Lou_Retton
'3646': Mary_Maddux
'3647': Mary_Matalin
'3648': Mary_McCarty
'3649': Mary_Robinson
'3650': Mary_Steenburgen
'3651': Mary_Sue_Coleman
'3652': Mary_Tyler_Moore
'3653': Mary_Zorn
'3654': Maryn_McKenna
'3655': Masahiko_Nagasawa
'3656': Masamori_Tokuyama
'3657': Masao_Azuma
'3658': Masaru_Hayami
'3659': Masatoshi_Koshiba
'3660': Masja_Juel
'3661': Massoud_Barzani
'3662': Masum_Turker
'3663': Mathias_Reichhold
'3664': Mathilda_Karel_Spak
'3665': Matt_Anderson
'3666': Matt_Braker
'3667': Matt_Damon
'3668': Matt_Dillon
'3669': Matt_Doherty
'3670': Matt_Herden
'3671': Matt_LeBlanc
'3672': Matt_Morris
'3673': Matt_Roney
'3674': Matt_Siebrandt
'3675': Matt_Walters
'3676': Matt_Welsh
'3677': Matthew_Broderick
'3678': Matthew_During
'3679': Matthew_McConaughey
'3680': Matthew_Ouimet
'3681': Matthew_Perry
'3682': Matthew_Vaughan
'3683': Matthias_Sammer
'3684': Maura_Tierney
'3685': Maureen_Fanning
'3686': Maureen_Kanka
'3687': Maurice_Cheeks
'3688': Maurice_Papon
'3689': Maurice_Strong
'3690': Mauricio_Macri
'3691': Mauricio_Pochetino
'3692': Mauro_Viza
'3693': Max_Baucus
'3694': Max_Biaggi
'3695': Max_Mayfield
'3696': Max_Mosley
'3697': Max_von_Sydow
'3698': Maxim_Afinogenov
'3699': Mayumi_Moriyama
'3700': McGuire_Gibson
'3701': Meg_Mallon
'3702': Meg_Wakeman
'3703': Megan_Mullally
'3704': Megawati_Sukarnoputri
'3705': Meghann_Shaughnessy
'3706': Mehdi_Baala
'3707': Mehmet_Ali_Sahin
'3708': Mehmet_Okur
'3709': Meirion_Evans
'3710': Mekhi_Phifer
'3711': Mel_Brooks
'3712': Mel_Gibson
'3713': Mel_Karmazin
'3714': Melana_Scantlin
'3715': Melanie_Griffith
'3716': Melchor_Cob_Castro
'3717': Meles_Zenawi
'3718': Melina_Kanakaredes
'3719': Melinda_Czink
'3720': Melissa_Etheridge
'3721': Melissa_Gilbert
'3722': Melissa_Joan_Hart
'3723': Melissa_Manchester
'3724': Melissa_Mulloy
'3725': Melissa_Stark
'3726': Melvin_Talbert
'3727': Mercedes_Amor
'3728': Meryl_Streep
'3729': Mesut_Yilmaz
'3730': Mia_Mottley
'3731': Mian_Khursheed_Mehmood_Kasuri
'3732': Micah_Knorr
'3733': Michael_Adams
'3734': Michael_Andretti
'3735': Michael_Arif
'3736': Michael_Ballack
'3737': Michael_Bloomberg
'3738': Michael_Bolton
'3739': Michael_Bouchard
'3740': Michael_Boyce
'3741': Michael_Brandon
'3742': Michael_Broad
'3743': Michael_Caine
'3744': Michael_Capellas
'3745': Michael_Chang
'3746': Michael_Chertoff
'3747': Michael_Chiklis
'3748': Michael_Clarke_Duncan
'3749': Michael_DeMinico
'3750': Michael_Dell
'3751': Michael_Denzel
'3752': Michael_Deutsch
'3753': Michael_Diekmann
'3754': Michael_Doleac
'3755': Michael_Donovan
'3756': Michael_Douglas
'3757': Michael_Fitzgerald
'3758': Michael_Frayn
'3759': Michael_Friedman
'3760': Michael_Goldrich
'3761': Michael_Guiler
'3762': Michael_Hagee
'3763': Michael_Haneke
'3764': Michael_Hoffa
'3765': Michael_J_Fox
'3766': Michael_J_Sheehan
'3767': Michael_Jackson
'3768': Michael_Jasny
'3769': Michael_Jordan
'3770': Michael_Kahn
'3771': Michael_Keaton
'3772': Michael_Killeen
'3773': Michael_Kirby
'3774': Michael_Kors
'3775': Michael_Kostelnik
'3776': Michael_Leavitt
'3777': Michael_Lechner
'3778': Michael_Linscott
'3779': Michael_Lopez-Alegria
'3780': Michael_McNeely
'3781': Michael_Michele
'3782': Michael_Milton
'3783': Michael_Moore
'3784': Michael_Munoz
'3785': Michael_Olowokandi
'3786': Michael_Patrick_King
'3787': Michael_Peat
'3788': Michael_Pfleger
'3789': Michael_Phelps
'3790': Michael_Piuze
'3791': Michael_Powell
'3792': Michael_Richards
'3793': Michael_Rolinee
'3794': Michael_Schumacher
'3795': Michael_Shane_Jolly
'3796': Michael_Sheehan
'3797': Michael_Shelby
'3798': Michael_Smith_Foster
'3799': Michael_Stark
'3800': Michael_Sullivan
'3801': Michael_Taylor
'3802': Michael_Wayne
'3803': Michael_Weiss
'3804': Michael_Winterbottom
'3805': Michalis_Chrisohoides
'3806': Micheal_Jourdain_Jr
'3807': Michel_Charles_Chretien
'3808': Michel_Duclos
'3809': Michel_Kratochvil
'3810': Michel_Minard
'3811': Michel_Temer
'3812': Michel_Therrien
'3813': Michelangelo_Antonioni
'3814': Michele_Placido
'3815': Michelle_Bachelet
'3816': Michelle_Branch
'3817': Michelle_Chiklis
'3818': Michelle_Collins
'3819': Michelle_Hofland
'3820': Michelle_Kwan
'3821': Michelle_Lecky
'3822': Michelle_Pfeiffer
'3823': Michelle_Rodriguez
'3824': Michelle_Yeoh
'3825': Mick_Jagger
'3826': Mick_McCarthy
'3827': Mickey_Gilley
'3828': Mickey_Loomis
'3829': Mickey_Rooney
'3830': Mickey_Sherman
'3831': Micky_Arison
'3832': Micky_Ward
'3833': Miguel_Aldana_Ibarra
'3834': Miguel_Angel_Rodriguez
'3835': Miguel_Contreras
'3836': Miguel_Cotto
'3837': Miguel_Estrada
'3838': Miguel_Hakim
'3839': Miguel_Jimenez
'3840': Miguel_Juarez_Perez
'3841': Miguel_Rosseto
'3842': Mika_Hakkinen
'3843': Mike_Alden
'3844': Mike_Babcock
'3845': Mike_Bair
'3846': Mike_Brey
'3847': Mike_Bryan
'3848': Mike_Carona
'3849': Mike_Cunning
'3850': Mike_Davis
'3851': Mike_Duke
'3852': Mike_Easley
'3853': Mike_Eskew
'3854': Mike_Farrar
'3855': Mike_Fisher
'3856': Mike_Flanagan
'3857': Mike_Gable
'3858': Mike_Helton
'3859': Mike_Holmgren
'3860': Mike_Johanns
'3861': Mike_Krzyzewski
'3862': Mike_Leach
'3863': Mike_Maroth
'3864': Mike_Martz
'3865': Mike_Matheny
'3866': Mike_Matthews
'3867': Mike_Miller
'3868': Mike_Montgomery
'3869': Mike_Myers
'3870': Mike_OConnell
'3871': Mike_Price
'3872': Mike_Richter
'3873': Mike_Samp
'3874': Mike_Scioscia
'3875': Mike_Sherman
'3876': Mike_Slive
'3877': Mike_Smith
'3878': Mike_Stefanik
'3879': Mike_Sweeney
'3880': Mike_Szymanczyk
'3881': Mike_Thibault
'3882': Mike_Tice
'3883': Mike_Tyson
'3884': Mike_Webster
'3885': Mike_Weir
'3886': Mikhail_Gorbachev
'3887': Mikhail_Kalashnikov
'3888': Mikhail_Kasyanov
'3889': Mikhail_Khodorkovsky
'3890': Mikhail_Shvydkoi
'3891': Mikhail_Wehbe
'3892': Mikhail_Youzhny
'3893': Mikulas_Dzurinda
'3894': Milan_Kucan
'3895': Milan_Milutinovic
'3896': Mile_Mrksic
'3897': Miles_Stewart
'3898': Millicent_Martin
'3899': Milo_Djukanovic
'3900': Milo_Maestrecampo
'3901': Milt_Heflin
'3902': Milt_Palacio
'3903': Milton_Berle
'3904': Milton_Wynants
'3905': Minnie_Driver
'3906': Minnie_Mendoza
'3907': Mira_Sorvino
'3908': Miranda_Gaddis
'3909': Miranda_Otto
'3910': Mireille_Jospin-Dandieu
'3911': Mirela_Manjani
'3912': Mireya_Elisa_Moscoso_Rodriguez
'3913': Mireya_Moscoso
'3914': Miroljub
'3915': Missy_Crider
'3916': Misty_Dawn_Clymer
'3917': Mitar_Rasevic
'3918': Mitch_Kupchak
'3919': Mitchell_Crooks
'3920': Mitchell_Daniels
'3921': Mitchell_Garabedian
'3922': Mitchell_McLaughlin
'3923': Mitchell_Potter
'3924': Mitchell_Swartz
'3925': Mitoji_Yabunaka
'3926': Mitsou_Gelinas
'3927': Mitt_Romney
'3928': Mitzi_Gaynor
'3929': Miyako_Miyazaki
'3930': Mladen_Naletilic
'3931': Mo_Elleithee
'3932': Moby
'3933': Mohamed_Benaissa
'3934': Mohamed_ElBaradei
'3935': Mohamed_Hammam
'3936': Mohamed_Seineldin
'3937': Mohammad_Aktar
'3938': Mohammad_Al-Sharief
'3939': Mohammad_Fares
'3940': Mohammad_Hasanein
'3941': Mohammad_Khatami
'3942': Mohammad_Mustapha_Miro
'3943': Mohammaed_Ahmad_Al_Jarallah
'3944': Mohammed_Abu_Sharia
'3945': Mohammed_Abulhasan
'3946': Mohammed_Al-Douri
'3947': Mohammed_Al_Hindi
'3948': Mohammed_Ashraf_Hafiz
'3949': Mohammed_Baqir_al-Hakim
'3950': Mohammed_Dahlan
'3951': Mohammed_Mehdi_Saleh
'3952': Mohammed_Salmane
'3953': Molly_Sims
'3954': Momcilo_Perisic
'3955': Momir_Nikolic
'3956': Mona_Ayoub
'3957': Mona_Locke
'3958': Mona_Rishmawi
'3959': Monica_Bellucci
'3960': Monica_Gabrielle
'3961': Monica_Lewinsky
'3962': Monica_Seles
'3963': Monica_Serra
'3964': Monica_Vergara
'3965': Monique_Ferreira
'3966': Monique_Gagnon-Tremblay
'3967': Monique_Garbrecht-Enfeldt
'3968': Monte_Kiffin
'3969': Moon-So-ri
'3970': Morgan_Fairchild
'3971': Morgan_Freeman
'3972': Morgan_Hentzen
'3973': Morris_Dees
'3974': Morris_Watts
'3975': Moshe_Katsav
'3976': Mother_Teresa
'3977': Ms_Dynamite
'3978': Mstislav_Rostropovich
'3979': Muammar_Gaddafi
'3980': Muffet_McGraw
'3981': Mufti_Mohammad_Syed
'3982': Muhammad_Ali
'3983': Muhammad_Ibrahim_Bilal
'3984': Muhammad_Saeed_al-Sahhaf
'3985': Mukesh_Ambani
'3986': Mukhtar_Alytnbayev
'3987': Munir_Akram
'3988': Muwafak_al-Ani
'3989': Myung_Yang
'3990': Na_Na_Keum
'3991': Nabil_Shaath
'3992': Nadia_Forte
'3993': Nadia_Petrova
'3994': Nadine_Vinzens
'3995': Naji_Sabri
'3996': Najib_al-Salhi
'3997': Namuddu_Florence
'3998': Nan_Wang
'3999': Nancy_Demme
'4000': Nancy_Greenlaw
'4001': Nancy_Humbert
'4002': Nancy_Kerrigan
'4003': Nancy_Pelosi
'4004': Nancy_Powell
'4005': Nancy_Reagan
'4006': Nancy_Sinatra
'4007': Nancy_Smith
'4008': Nanni_Moretti
'4009': Naomi_Bronstein
'4010': Naomi_Campbell
'4011': Naomi_Hayashi
'4012': Naomi_Watts
'4013': Naoto_Kan
'4014': Narayan_Singh_Pun
'4015': Narendra_Modi
'4016': Nasser_al-Kidwa
'4017': Nastassia_Kinski
'4018': Nastia_Liukin
'4019': Natalia_Dmitrieva
'4020': Natalia_Motuziuk
'4021': Natalia_Verbeke
'4022': Natalia_Vodonova
'4023': Natalie_Cole
'4024': Natalie_Coughlin
'4025': Natalie_Imbruglia
'4026': Natalie_Juniardi
'4027': Natalie_Maines
'4028': Natalie_Stewart
'4029': Natalie_Williams
'4030': Natalya_Sazanovich
'4031': Natanaela_Barnova
'4032': Natasa_Micic
'4033': Natasha_Henstridge
'4034': Natasha_Lyonne
'4035': Natasha_McElhone
'4036': Nate_Blackwell
'4037': Nate_Huffman
'4038': Nate_Hybl
'4039': Nathalia_Gillot
'4040': Nathalie_Baye
'4041': Nathalie_Dechy
'4042': Nathalie_Gagnon
'4043': Nathan_Doudney
'4044': Nathan_Lane
'4045': Nathan_Powell
'4046': Nathan_Smith
'4047': Nathirah_Hussein
'4048': Nawabzada_Nasrullah_Khan
'4049': Nebojsa_Pavkovic
'4050': Neil_Goldman
'4051': Neil_Moritz
'4052': Nelson_Acosta
'4053': Nelson_Mandela
'4054': Nelson_Shanks
'4055': Neri_Marcore
'4056': Nestor_Gonzalez
'4057': Nestor_Kirchner
'4058': Nestor_Santillan
'4059': Newt_Gingrich
'4060': Newton_Carlton_Slawson
'4061': Nia_Vardalos
'4062': Niall_Connolly
'4063': Nicanor_Duarte_Frutos
'4064': Nicholas_Byron
'4065': Nicholas_Tse
'4066': Nicholoas_DiMarzio
'4067': Nick_Cassavetes
'4068': Nick_Markakis
'4069': Nick_Nolte
'4070': Nick_Price
'4071': Nick_Rahall
'4072': Nick_Reilly
'4073': Nick_Rimando
'4074': Nick_Turner
'4075': Nicklas_Lidstrom
'4076': Nicola_Bono
'4077': Nicola_Wells
'4078': Nicolas_Cage
'4079': Nicolas_Escude
'4080': Nicolas_Eyzaguirre
'4081': Nicolas_Kiefer
'4082': Nicolas_Lapentti
'4083': Nicolas_Latorre
'4084': Nicolas_Macrozonaris
'4085': Nicolas_Massu
'4086': Nicolas_Sarkozy
'4087': Nicole
'4088': Nicole_Hiltz
'4089': Nicole_Kidman
'4090': Nicole_Parker
'4091': Nicoletta_Braschi
'4092': Nida_Blanca
'4093': Nigel_Redden
'4094': Nikki_Cascone
'4095': Nikki_McKibbin
'4096': Nikki_Reed
'4097': Nikki_Teasley
'4098': Nikolay_Davydenko
'4099': Nila_Ferran
'4100': Nina_Jacobson
'4101': Nina_Pecari
'4102': Nino_DAngelo
'4103': Nizar_Trabelsi
'4104': Noah_Wyle
'4105': Nobuyuki_Idei
'4106': Noel_Forgeard
'4107': Noel_Niell
'4108': Noelle_Bush
'4109': Noer_Moeis
'4110': Noer_Muis
'4111': Nona_Gaye
'4112': Nong_Duc_Manh
'4113': Noor_Mohammed
'4114': Nora_Bendijo
'4115': Nora_Ephron
'4116': Norah_Jones
'4117': Norbert_van_Heyst
'4118': Norio_Ohga
'4119': Norm_Coleman
'4120': Norm_Macdonald
'4121': Norman_Jewison
'4122': Norman_Mailer
'4123': Norman_Mineta
'4124': Normand_Legault
'4125': Norodom_Chakrapong
'4126': Norodom_Sihanouk
'4127': Nova_Esther_Guthrie
'4128': Nuon_Chea
'4129': Nur_Jaafar
'4130': Nursultan_Nazarbayev
'4131': OJ_Simpson
'4132': Octavio_Lara
'4133': Odai_Hussein
'4134': Odilia_Collazo
'4135': Oleg_Romantsev
'4136': Oleksandr_Moroz
'4137': Olene_Walker
'4138': Olesya_Bonabarenko
'4139': Oliver_Neuville
'4140': Oliver_Phelps
'4141': Oliver_Stone
'4142': Olivera_Labus
'4143': Olivia_Newton-John
'4144': Olivier_Boulay
'4145': Olivier_Rochus
'4146': Olympia_Dukakis
'4147': Omar_Khan_Sharif
'4148': Omar_Sharif
'4149': Omar_Vizquel
'4150': Omar_el-Heib
'4151': Ontario_Lett
'4152': Oprah_Winfrey
'4153': Oracene_Williams
'4154': Orlando_Bloom
'4155': Ornella_Muti
'4156': Orrin_Hatch
'4157': Osama_Al_Baz
'4158': Osama_bin_Laden
'4159': Oscar_Bolanos
'4160': Oscar_DLeon
'4161': Oscar_De_La_Hoya
'4162': Oscar_Elias_Biscet
'4163': Oscar_de_la_Renta
'4164': Osmond_Smith
'4165': Osrat_Iosef
'4166': Oswald_Gruebel
'4167': Oswaldo_Paya
'4168': Otto_Reich
'4169': Otto_Schily
'4170': Owen_Nolan
'4171': Owen_Wilson
'4172': Oxana_Fedorova
'4173': Ozzie_Smith
'4174': Ozzy_Osbourne
'4175': Pa_Kou_Hang
'4176': Pablo_Khulental
'4177': Pablo_Latras
'4178': Paddy_Long
'4179': Paddy_Torsney
'4180': Padraig_Harrington
'4181': Paek_Nam_Sun
'4182': Paige_Fitzgerald
'4183': Pak_Gil_Yon
'4184': Pamela_Anderson
'4185': Pamela_Melroy
'4186': Paola_Espinoza
'4187': Paradorn_Srichaphan
'4188': Paris_Hilton
'4189': Park_Jie-won
'4190': Park_Jung_Sung
'4191': Park_Na-kyong
'4192': Parris_Glendening
'4193': Parthiv_Patel
'4194': Pascal_Affi_Nguessan
'4195': Pascal_Lamy
'4196': Pascal_Quignard
'4197': Pascal_Rheaume
'4198': Pat_Burns
'4199': Pat_Cox
'4200': Pat_DAmuro
'4201': Pat_Riley
'4202': Pat_Rochester
'4203': Pat_Summerall
'4204': Pat_Summitt
'4205': Pat_Wharton
'4206': Patrice_Chereau
'4207': Patricia_Clarkson
'4208': Patricia_Garone
'4209': Patricia_Hearst
'4210': Patricia_Heaton
'4211': Patricia_Johnson
'4212': Patricia_Medina
'4213': Patricia_Phillips
'4214': Patricia_Russo
'4215': Patricia_Wartusch
'4216': Patrick_Bourrat
'4217': Patrick_Clawsen
'4218': Patrick_Coleman
'4219': Patrick_Dennehy
'4220': Patrick_Eaves
'4221': Patrick_Ewing
'4222': Patrick_Kron
'4223': Patrick_Leahy
'4224': Patrick_McEnroe
'4225': Patrick_Rafter
'4226': Patrick_Roy
'4227': Patrick_Stewart
'4228': Patrik_Kristiansson
'4229': Patsy_Hardy
'4230': Patsy_Kensit
'4231': Patsy_Mink
'4232': Patti_Balgojevich
'4233': Patti_Labelle
'4234': Patti_Lank
'4235': Patti_Smith
'4236': Patty_Duke
'4237': Patty_Schnyder
'4238': Patty_Sheehan
'4239': Paul-Henri_Mathieu
'4240': Paul_Bettany
'4241': Paul_Brandt
'4242': Paul_Bremer
'4243': Paul_Burrell
'4244': Paul_Byrd
'4245': Paul_Celluci
'4246': Paul_Cerjan
'4247': Paul_Clark
'4248': Paul_Coppin
'4249': Paul_Crake
'4250': Paul_Desmarais
'4251': Paul_Ebert
'4252': Paul_Farley
'4253': Paul_Gannon
'4254': Paul_Gascoigne
'4255': Paul_Greengrass
'4256': Paul_Henderson
'4257': Paul_Hogan
'4258': Paul_Johnson
'4259': Paul_Kagame
'4260': Paul_Kariya
'4261': Paul_Kelleher
'4262': Paul_Krueger
'4263': Paul_LeClerc
'4264': Paul_Li_Calsi
'4265': Paul_Lo_Duca
'4266': Paul_Lockhart
'4267': Paul_Luvera
'4268': Paul_Martin
'4269': Paul_McCartney
'4270': Paul_McNulty
'4271': Paul_Michael_Daniels
'4272': Paul_Murphy
'4273': Paul_Newman
'4274': Paul_ONeill
'4275': Paul_Otellini
'4276': Paul_Patton
'4277': Paul_Pierce
'4278': Paul_Reiser
'4279': Paul_Sarbanes
'4280': Paul_Schrader
'4281': Paul_Shanley
'4282': Paul_Tagliabue
'4283': Paul_Tracy
'4284': Paul_Vathis
'4285': Paul_Walker
'4286': Paul_Wals
'4287': Paul_Wellstone
'4288': Paul_William_Hurley
'4289': Paul_Wilson
'4290': Paul_Wolfowitz
'4291': Paul_Wollnough
'4292': Paula_Abdul
'4293': Paula_Dobriansky
'4294': Paula_Locke
'4295': Paula_Prentiss
'4296': Paula_Radcliffe
'4297': Paula_Zahn
'4298': Pauley_Perrette
'4299': Paulie_Ayala
'4300': Paulina_Rodriguez_Davila
'4301': Pauline_Landers
'4302': Pauline_Phillips
'4303': Paulo_Cesar_Pinheiro
'4304': Pedro_Almodovar
'4305': Pedro_Alvarez
'4306': Pedro_Duque
'4307': Pedro_Mahecha
'4308': Pedro_Malan
'4309': Pedro_Martinez
'4310': Pedro_Pauleta
'4311': Pedro_Solbes
'4312': Pedro_Velasquez
'4313': Peggy_McGuinness
'4314': Pele
'4315': Penelope_Ann_Miller
'4316': Penelope_Cruz
'4317': Penelope_Taylor
'4318': Penny_Dupuie
'4319': Penny_Lancaster
'4320': Percy_Gibson
'4321': Peri_Gilpin
'4322': Pernilla_Bjorn
'4323': Perri_Shaw
'4324': Perry_Compton
'4325': Perry_Farrell
'4326': Perry_Gibbs
'4327': Pervez_Musharraf
'4328': Pete_Aldridge
'4329': Pete_Beaudrault
'4330': Pete_Carroll
'4331': Pete_Gillen
'4332': Pete_Rose
'4333': Pete_Sampras
'4334': Peter_Ahearn
'4335': Peter_Albertsen
'4336': Peter_Arnett
'4337': Peter_Bacanovic
'4338': Peter_Camejo
'4339': Peter_Care
'4340': Peter_Caruana
'4341': Peter_Chan
'4342': Peter_Costello
'4343': Peter_Fisher
'4344': Peter_Fitzgerald
'4345': Peter_Fonda
'4346': Peter_Gabriel
'4347': Peter_Gilmour
'4348': Peter_Goldmark
'4349': Peter_Greenaway
'4350': Peter_Greenspun
'4351': Peter_Harrison
'4352': Peter_Hartz
'4353': Peter_Harvey
'4354': Peter_Hillary
'4355': Peter_Hollingworth
'4356': Peter_Holmberg
'4357': Peter_Hunt
'4358': Peter_Lundgren
'4359': Peter_Mackay
'4360': Peter_Mansbridge
'4361': Peter_Max
'4362': Peter_Medgyessy
'4363': Peter_Mugyeni
'4364': Peter_Mullan
'4365': Peter_OToole
'4366': Peter_Rasch
'4367': Peter_Rasmussen
'4368': Peter_Schultz
'4369': Peter_Sejna
'4370': Peter_Shaw
'4371': Peter_Struck
'4372': Peter_Ueberroth
'4373': Petria_Thomas
'4374': Petro_Symonenko
'4375': Pham_Sy_Chien
'4376': Pham_Thi_Mai_Phuong
'4377': Phan_Van_Khai
'4378': Pharrell_Williams
'4379': Phil_Bennett
'4380': Phil_Bredesen
'4381': Phil_Cline
'4382': Phil_Cullen
'4383': Phil_Donahue
'4384': Phil_Gramm
'4385': Phil_Jackson
'4386': Phil_Johnson
'4387': Phil_McGraw
'4388': Phil_Mickelson
'4389': Phil_Morris
'4390': Phil_Vassar
'4391': Philip_Cummings
'4392': Philip_Murtaugh
'4393': Philip_Zalewski
'4394': Philippe_Gagnon
'4395': Philippe_Noiret
'4396': Phillip_Fulmer
'4397': Phillip_Seymor_Hoffmann
'4398': Phillipe_Comtois
'4399': Phillips_Idowu
'4400': Phoenix_Chang
'4401': Picabo_Street
'4402': Pier_Ferdinando_Casini
'4403': Pierce_Brosnan
'4404': Pierre_Boulanger
'4405': Pierre_Gagnon
'4406': Pierre_Lacroix
'4407': Pierre_Pettigrew
'4408': Pierre_Png
'4409': Pierre_Van_Hooijdonk
'4410': Piers_Sellers
'4411': Pieter_Bouw
'4412': Pilar_Montenegro
'4413': Pinar_del_Rio
'4414': Pio_Laghi
'4415': Piotr_Anderszewski
'4416': Placido_Domingo
'4417': Platon_Lebedev
'4418': Poala_Suarez
'4419': Polona_Bas
'4420': Porter_Goss
'4421': Portia_de_Rossi
'4422': Prakash_Hinduja
'4423': Prem_Kumar_Nair
'4424': Prince_Charles
'4425': Prince_Claus
'4426': Prince_Edward
'4427': Prince_Felipe
'4428': Prince_Harry
'4429': Prince_Naruhito
'4430': Prince_Philippe
'4431': Prince_Rainier_III
'4432': Prince_Willem-Alexander
'4433': Prince_William
'4434': Princess_Aiko
'4435': Princess_Anne
'4436': Princess_Caroline
'4437': Princess_Diana
'4438': Princess_Elisabeth
'4439': Princess_Hisako
'4440': Princess_Masako
'4441': Princess_Maxima
'4442': Princess_Stephanie
'4443': Princess_Victoria
'4444': Pringe_Ernst_August
'4445': Priscilla_Owen
'4446': Priscilla_Presley
'4447': Priyanka_Chopra
'4448': Prospero_Pichay
'4449': Pupi_Avati
'4450': Pyar_Jung_Thapa
'4451': Qais_al-Kazali
'4452': Qazi_Afzal
'4453': Qazi_Hussain_Ahmed
'4454': Qian_Qichen
'4455': Queen_Beatrix
'4456': Queen_Elizabeth_II
'4457': Queen_Latifah
'4458': Queen_Noor
'4459': Queen_Rania
'4460': Queen_Silvia
'4461': Queen_Sofia
'4462': Quin_Snyder
'4463': Quincy_Jones
'4464': Qusai_Hussein
'4465': Raaf_Schefter
'4466': Raag_Singhal
'4467': Rachel_Corrie
'4468': Rachel_Griffiths
'4469': Rachel_Hunter
'4470': Rachel_Kempson
'4471': Rachel_Leigh_Cook
'4472': Rachel_Roy
'4473': Rachel_Wadsworth
'4474': Rachel_Wheatley
'4475': Radovan_Karadzic
'4476': Raf_Vallone
'4477': Rafael_Bielsa
'4478': Rafael_Ramirez
'4479': Rafael_Vinoly
'4480': Rafeeuddin_Ahmed
'4481': Rafidah_Aziz
'4482': Rafiq_Hariri
'4483': Raghad_Saddam_Hussein
'4484': Rahul_Dravid
'4485': Rainer_Geulen
'4486': Rainer_Gut
'4487': Rainer_Schuettler
'4488': Raja_Ibrahim
'4489': Raja_Qureshi
'4490': Raja_Ramani
'4491': Raja_Zafar-ul-Haq
'4492': Ralf_Schumacher
'4493': Ralph_Fiennes
'4494': Ralph_Firman
'4495': Ralph_Friedgen
'4496': Ralph_Goodale
'4497': Ralph_Klein
'4498': Ralph_Lauren
'4499': Ralph_Nader
'4500': Ralph_Sampson
'4501': Ramiro_Goben_Reducindo
'4502': Ramon_Cardenas
'4503': Ramon_Delgado
'4504': Ramon_Ponce_de_Leon
'4505': Ramon_Santana
'4506': Ramona_Rispton
'4507': Rand_Beers
'4508': Rand_Miller
'4509': Randall_Terry
'4510': Randall_Tobias
'4511': Randy_Brown
'4512': Randy_Dryer
'4513': Randy_Ferbey
'4514': Randy_Jackson
'4515': Randy_Johnson
'4516': Randy_Travis
'4517': Rani_Mukherjee
'4518': Ranil_Wickremasinghe
'4519': Raoul_Ruiz
'4520': Raquel_Welch
'4521': Rashid_Qureshi
'4522': Ratna_Sari_Dewi_Sukarno
'4523': Raul_Castaneda
'4524': Raul_Chacon
'4525': Raul_Cubas
'4526': Raul_Gonzalez
'4527': Raul_Ibanez
'4528': Raul_Mondesi
'4529': Raul_Rivero
'4530': Ravan_AG_Farhadi
'4531': Ray_Allen
'4532': Ray_Bradbury
'4533': Ray_Evernham
'4534': Ray_Halbritter
'4535': Ray_Lewis
'4536': Ray_Liotta
'4537': Ray_Lucas
'4538': Ray_Morrough
'4539': Ray_Nagin
'4540': Ray_Price
'4541': Ray_Romano
'4542': Ray_Sherman
'4543': Ray_Wasden
'4544': Ray_Young
'4545': Raymond_Arthurs
'4546': Raymond_Odierno
'4547': Raza_Rabbani
'4548': Razali_Ismail
'4549': Rebecca_Romijn-Stamos
'4550': Rebekah_Chantay_Revels
'4551': Recep_Tayyip_Erdogan
'4552': Red_Auerbach
'4553': Reese_Witherspoon
'4554': Reggie_Lewis
'4555': Reggie_Miller
'4556': Reggie_Sanders
'4557': Regina_Ip
'4558': Reginald_Hudlin
'4559': Reina_Hayes
'4560': Reinhard_Buetikofer
'4561': Ren_Qingjin
'4562': Rena_Sofer
'4563': Renato_Soru
'4564': Rene_Antonio_Leon_Rodriguez
'4565': Rene_Portland
'4566': Renee_Zellweger
'4567': Retief_Goosen
'4568': Rey_Sanchez
'4569': Reyyan_Uzuner
'4570': Rhett_Warrener
'4571': Rhina_Villatoro
'4572': Ricardo_Lagos
'4573': Ricardo_Lopez_Murphy
'4574': Ricardo_Maduro
'4575': Ricardo_Mayorga
'4576': Ricardo_Monasterio
'4577': Ricardo_Sanchez
'4578': Riccardo_Muti
'4579': Rich_Brooks
'4580': Rich_Gannon
'4581': Richard_Armitage
'4582': Richard_Barry
'4583': Richard_Branson
'4584': Richard_Butler
'4585': Richard_Carl
'4586': Richard_Chamberlain
'4587': Richard_Cohen
'4588': Richard_Crenna
'4589': Richard_Daley
'4590': Richard_Dreyfuss
'4591': Richard_Fine
'4592': Richard_Gephardt
'4593': Richard_Gere
'4594': Richard_Greenberg
'4595': Richard_Haass
'4596': Richard_Hamilton
'4597': Richard_Harris
'4598': Richard_Hellfant
'4599': Richard_Jefferson
'4600': Richard_Jewell
'4601': Richard_Krajicek
'4602': Richard_Langille
'4603': Richard_Lennon
'4604': Richard_Levin
'4605': Richard_Lugar
'4606': Richard_Myers
'4607': Richard_Naughton
'4608': Richard_Norton-Taylor
'4609': Richard_Palmer
'4610': Richard_Parsons
'4611': Richard_Paul_Evans
'4612': Richard_Penniman
'4613': Richard_Pennington
'4614': Richard_Perle
'4615': Richard_Regenhard
'4616': Richard_Reid
'4617': Richard_Rodriguez
'4618': Richard_Sambrook
'4619': Richard_Shelby
'4620': Richard_Sterner
'4621': Richard_Tubb
'4622': Richard_Virenque
'4623': Richard_Ward
'4624': Richie_Adubato
'4625': Rick_Barnes
'4626': Rick_Bland
'4627': Rick_Bragg
'4628': Rick_Carlisle
'4629': Rick_Caruso
'4630': Rick_Dinse
'4631': Rick_Husband
'4632': Rick_Lu
'4633': Rick_Perry
'4634': Rick_Pitino
'4635': Rick_Reed
'4636': Rick_Rickert
'4637': Rick_Romley
'4638': Rick_Santorum
'4639': Rick_Stansbury
'4640': Rick_Wagoner
'4641': Ricky_Barnes
'4642': Ricky_Cottrill
'4643': Ricky_Martin
'4644': Ricky_Ponting
'4645': Ricky_Quick
'4646': Ricky_Ray
'4647': Ridley_Scott
'4648': Riek_Blanjaar
'4649': Rien_Long
'4650': Rina_Lazo
'4651': Ringo_Starr
'4652': Rio_Ferdinand
'4653': Rita_Grande
'4654': Rita_Moreno
'4655': Rita_Wilson
'4656': Rob_Lowe
'4657': Rob_Marshall
'4658': Rob_Moore
'4659': Rob_Morrow
'4660': Rob_Niedermayer
'4661': Rob_Ramsay
'4662': Rob_Schneider
'4663': Robbie_Coltrane
'4664': Robbie_Fowler
'4665': Robbie_Mc_Ewen
'4666': Robbie_Naish
'4667': Robbie_Williams
'4668': Robby_Ginepri
'4669': Robert_Altman
'4670': Robert_Beck
'4671': Robert_Blackwill
'4672': Robert_Blake
'4673': Robert_Bonner
'4674': Robert_Bullock
'4675': Robert_DeFraites
'4676': Robert_De_Niro
'4677': Robert_Douglas
'4678': Robert_Downey_Jr
'4679': Robert_Durst
'4680': Robert_Duvall
'4681': Robert_Ehrlich
'4682': Robert_Evans
'4683': Robert_F_Kennedy_Jr
'4684': Robert_Fico
'4685': Robert_Flodquist
'4686': Robert_Gallo
'4687': Robert_Gordon_Card
'4688': Robert_Hanssen
'4689': Robert_Horan
'4690': Robert_Hyatt
'4691': Robert_Kipkoech_Cheruiyot
'4692': Robert_Kocharian
'4693': Robert_Korzeniowski
'4694': Robert_Lange
'4695': Robert_Lee_Yates_Jr
'4696': Robert_Marshall
'4697': Robert_McKee
'4698': Robert_Morvillo
'4699': Robert_Mueller
'4700': Robert_Mugabe
'4701': Robert_Nardelli
'4702': Robert_Nillson
'4703': Robert_Pollack
'4704': Robert_Redford
'4705': Robert_Schuller
'4706': Robert_Stack
'4707': Robert_Torricelli
'4708': Robert_Towne
'4709': Robert_Tyrrell
'4710': Robert_Vowler
'4711': Robert_Wagner
'4712': Robert_Weitzel
'4713': Robert_Wiener
'4714': Robert_Witt
'4715': Robert_Woody_Johnson
'4716': Robert_Zoellick
'4717': Roberta_Combs
'4718': Roberto_Arguelles
'4719': Roberto_Benigni
'4720': Roberto_Canessa
'4721': Roberto_Carlos
'4722': Roberto_Cavalli
'4723': Roberto_Cercelletta
'4724': Roberto_Guaterroma
'4725': Roberto_Laratro
'4726': Roberto_Lavagna
'4727': Roberto_Marinho
'4728': Roberto_Robaina
'4729': Roberto_Tovar
'4730': Robin_Cook
'4731': Robin_Johansen
'4732': Robin_McGraw
'4733': Robin_McLaurin_Williams
'4734': Robin_Tunney
'4735': Robin_Wagner
'4736': Robin_Williams
'4737': Robin_Wright_Penn
'4738': Robinson_Stevenin
'4739': Rocco_Buttiglione
'4740': Rod_Blagojevich
'4741': Rod_Bryden
'4742': Rod_Jong-il
'4743': Rod_Paige
'4744': Rod_Stewart
'4745': Rod_Thorn
'4746': Rodney_Dangerfield
'4747': Rodney_Rempt
'4748': Rodolfo_Abalos
'4749': Rodrigo_Borja
'4750': Rodrigo_Rato
'4751': Rodrigo_de_la_Cerna
'4752': Roel_Campos
'4753': Rogelio_Montemayor
'4754': Rogelio_Ramos
'4755': Roger_Clemens
'4756': Roger_Cook
'4757': Roger_Corbett
'4758': Roger_Daltrey
'4759': Roger_Etchegaray
'4760': Roger_Federer
'4761': Roger_Grimes
'4762': Roger_King
'4763': Roger_Lyons
'4764': Roger_Machado
'4765': Roger_Mahony
'4766': Roger_Moore
'4767': Roger_Penske
'4768': Roger_Staubach
'4769': Roger_Suarez
'4770': Roger_Toussaint
'4771': Roger_Winter
'4772': Rogerio_Romero
'4773': Roh_Moo-hyun
'4774': Rohinton_Mistry
'4775': Rohman_al-Ghozi
'4776': Roland_Koch
'4777': Rolandas_Paksas
'4778': Rolf_Eckrodt
'4779': Rolf_Zimmermann
'4780': Rollie_Massimino
'4781': Romain_Duris
'4782': Roman_Abramovich
'4783': Roman_Coppola
'4784': Roman_Polanski
'4785': Roman_Tam
'4786': Romano_Prodi
'4787': Romario_Farias
'4788': Romeo_Gigli
'4789': Ron_Dittemore
'4790': Ron_Gonzales
'4791': Ron_Howard
'4792': Ron_Kirk
'4793': Ron_Lantz
'4794': Ron_Zook
'4795': Ronald_Brower
'4796': Ronald_Harwood
'4797': Ronald_Ito
'4798': Ronald_Kadish
'4799': Ronald_Kessler
'4800': Ronald_Perelman
'4801': Ronald_Post
'4802': Ronald_Reagan
'4803': Ronald_White
'4804': Ronald_Young_Jr
'4805': Ronaldo_Luis_Nazario_de_Lima
'4806': Ronde_Barber
'4807': Ronnie_Jagday
'4808': Ronnie_Musgrove
'4809': Rosa_Haywa_de_Condori
'4810': Rosalie_Perkov
'4811': Rosalyn_Carter
'4812': Rosario_Dawson
'4813': Rose_Linkins
'4814': Rose_Marie
'4815': Roseanne_Barr
'4816': Rosemarie_Stack
'4817': Rosie_Perez
'4818': Rosny_Desroches
'4819': Ross_Verba
'4820': Rowan_Williams
'4821': Roy_Blunt
'4822': Roy_Chaderton
'4823': Roy_Halladay
'4824': Roy_Jones_Jr
'4825': Roy_Moore
'4826': Roy_Rogers
'4827': Roy_Romanow
'4828': Roy_Williams
'4829': Ruano_Pascual
'4830': Ruben_Sierra
'4831': Ruben_Studdard
'4832': Ruben_Wolkowyski
'4833': Rubens_Barrichello
'4834': Rudi_Voeller
'4835': Rudolf_Schuster
'4836': Rudolph_Giuliani
'4837': Rudolph_Holton
'4838': Rudy_Tomjanovich
'4839': Rulon_Gardner
'4840': Rupert_Grint
'4841': Rupert_Murdoch
'4842': Russ_Ortiz
'4843': Russell_Coutts
'4844': Russell_Crowe
'4845': Russell_Simmons
'4846': Rustu_Recber
'4847': Ruth_Bader_Ginsburg
'4848': Ruth_Christofferson
'4849': Ruth_Dreifuss
'4850': Ruth_Harlow
'4851': Ruth_Pearce
'4852': Ruth_Stubbs
'4853': Ryan_Drese
'4854': Ryan_Goodman
'4855': Ryan_Leaf
'4856': Ryan_Newman
'4857': Ryan_Nyquist
'4858': SJ_Twu
'4859': S_Jayakumar
'4860': Saadi_Gadhafi
'4861': Sabah_Al-Ahmad_Al-Jaber_Al-Sabah
'4862': Saburo_Kawabuchi
'4863': Sachiko_Yamada
'4864': Sachin_Tendulkar
'4865': Sada_Jacobson
'4866': Sadam_Hassan
'4867': Saddam_Hussein
'4868': Sadie_Frost
'4869': Saeb_Erekat
'4870': Saeed_Anwar
'4871': Saeed_Mortazavi
'4872': Sahim_Alwan
'4873': Saied_Hadi_al_Mudarissi
'4874': Sally_Clark
'4875': Sally_Field
'4876': Sally_Kirkland
'4877': Sally_Ride
'4878': Salma_Hayek
'4879': Salman_Khan
'4880': Salman_Rushdie
'4881': Sam_Bith
'4882': Sam_Brownback
'4883': Sam_Gerald
'4884': Sam_Mendes
'4885': Sam_Rockwell
'4886': Sam_Torrance
'4887': Saman_Shali
'4888': Samantha_Daniels
'4889': Samantha_Ledster
'4890': Sami_Al-Arian
'4891': Samira_Makhmalbaf
'4892': Sammy_Knight
'4893': Sammy_Sosa
'4894': Samuel_Waksal
'4895': San_Lan
'4896': Sananda_Maitreya
'4897': Sandra_Banning
'4898': Sandra_Bullock
'4899': Sandra_Ceccarelli
'4900': Sandra_Day_OConner
'4901': Sandra_Milo
'4902': Sandra_Shamas
'4903': Sandy_Smith
'4904': Sandy_Wise
'4905': Sanja_Papic
'4906': Sanjay_Chawla
'4907': Sanjay_Gupta
'4908': Santiago_Botero
'4909': Saoud_Al_Faisal
'4910': Saparmurat_Niyazov
'4911': Sara_Elisabeth_Ahmad
'4912': Sara_Silverman
'4913': Sarah_Canale
'4914': Sarah_Hughes
'4915': Sarah_Jessica_Parker
'4916': Sarah_Michelle_Gellar
'4917': Sarah_Price
'4918': Sarah_Weddington
'4919': Sarah_Wynter
'4920': Sargis_Sargsian
'4921': Sasha_Alexander
'4922': Sasha_Cohen
'4923': Satnarine_Sharma
'4924': Scott_Blum
'4925': Scott_Dalton
'4926': Scott_Dickson
'4927': Scott_Fawell
'4928': Scott_Gorelick
'4929': Scott_Hamilton
'4930': Scott_Hoch
'4931': Scott_Hubbard
'4932': Scott_McClellan
'4933': Scott_McNealy
'4934': Scott_OGrady
'4935': Scott_Peterson
'4936': Scott_Ritter
'4937': Scott_Rolen
'4938': Scott_Rudin
'4939': Scott_Sullivan
'4940': Scott_Verplank
'4941': Scott_Wallach
'4942': Scott_Weiland
'4943': Scott_Wittman
'4944': Scott_Wolf
'4945': Scott_Yates
'4946': Se_Hyuk_Joo
'4947': Sean_Astin
'4948': Sean_Combs
'4949': Sean_Hayes
'4950': Sean_OKeefe
'4951': Sean_Patrick_OMalley
'4952': Sean_Patrick_Thomas
'4953': Sean_Penn
'4954': Sean_Townsend
'4955': Sebastian_Cuattrin
'4956': Sebastian_Porto
'4957': Sebastian_Saja
'4958': Sebastien_Grosjean
'4959': Sedigh_Barmak
'4960': Selma_Phoenix
'4961': Sepp_Blatter
'4962': Serena_Karlan
'4963': Serena_Williams
'4964': Sereyvuth_Kem
'4965': Serge_Klarsfeld
'4966': Serge_Melac
'4967': Serge_Tchuruk
'4968': Sergei_Alexandrovitch_Ordzhonikidze
'4969': Sergei_Ivanov
'4970': Sergei_Yastrzhembsky
'4971': Sergei_Yushenkov
'4972': Sergey_Lavrov
'4973': Sergio_Castellitto
'4974': Sergio_Garcia
'4975': Sergio_Vieira_De_Mello
'4976': Seth_Gorney
'4977': Severino_Antinori
'4978': Seydou_Diarra
'4979': Seymour_Cassell
'4980': Shae-Lynn_Bourne
'4981': Shafal_Mosed
'4982': Shamai_Leibowitz
'4983': Shane_Hmiel
'4984': Shane_Loux
'4985': Shane_Mosley
'4986': Shane_Phillips
'4987': Shane_Reynolds
'4988': Shane_Warne
'4989': Shania_Twain
'4990': Shanna_Zolman
'4991': Shannon_OBrien
'4992': Shannyn_Sossamon
'4993': Sharess_Harrell
'4994': Sharon_Davis
'4995': Sharon_Frey
'4996': Sharon_Osbourne
'4997': Sharon_Robinson
'4998': Sharon_Stone
'4999': Shaukat_Aziz
'5000': Shaul_Mofaz
'5001': Shaun_Pollock
'5002': Shaun_Rusling
'5003': Shavon_Earp
'5004': Shawn_Bradley
'5005': Shawn_Kemp
'5006': Shawn_Marion
'5007': Sheikh_Ahmed_Yassin
'5008': Sheila_Copps
'5009': Sheila_Fraser
'5010': Sheila_Taormina
'5011': Sheila_Wellstone
'5012': Sheldon_Silver
'5013': Sherri_Coale
'5014': Sherry_Fisher
'5015': Sherry_Irving
'5016': Sheryl_Crow
'5017': Shi_Guangsheng
'5018': Shia_LaBeouf
'5019': Shigeo_Nagashima
'5020': Shigeru_Ishiba
'5021': Shimon_Peres
'5022': Shingo_Katayama
'5023': Shingo_Suetsugu
'5024': Shinya_Taniguchi
'5025': Shinzo_Abe
'5026': Shireen_Amir_Begum
'5027': Shirley_Jones
'5028': Shobha_De
'5029': Shoshana_Johnson
'5030': Shoshannah_Stern
'5031': Sid_Caesar
'5032': Sidney_Kimmel
'5033': Sidney_Poitier
'5034': Sigourney_Weaver
'5035': Sila_Calderon
'5036': Silvan_Shalom
'5037': Silvia_Farina_Elia
'5038': Silvie_Cabero
'5039': Silvio_Berlusconi
'5040': Silvio_Fernandez
'5041': Sim_Yong
'5042': Simon_Chalk
'5043': Simon_Cowell
'5044': Simon_Larose
'5045': Simon_Yam
'5046': Simona_Hradil
'5047': Sinead_OConnor
'5048': Sivan_Klein
'5049': Skip_Prosser
'5050': Slobodan_Milosevic
'5051': Soenarno
'5052': Sofia_Milos
'5053': Sofyan_Dawood
'5054': Sohail_Abbas
'5055': Sok_An
'5056': Solomon_Passy
'5057': Sonia_Gandhi
'5058': Sonia_Lopez
'5059': Sonja_Kesselschlager
'5060': Sonya_Walger
'5061': Soon_Yi
'5062': Sophia_Loren
'5063': Sophie
'5064': Sourav_Ganguly
'5065': Spencer_Abraham
'5066': Spike_Helmick
'5067': Spike_Jonze
'5068': Spike_Lee
'5069': Stacey_Dales-Schuman
'5070': Stacey_Jones
'5071': Stacey_Yamaguchi
'5072': Stacy_Dragila
'5073': Stacy_Nelson
'5074': Stan_Heath
'5075': Stan_Kasten
'5076': Stan_Kroenke
'5077': Stanislas_Wawrinka
'5078': Stanley_Ho
'5079': Stanley_McChrystal
'5080': Stanley_Nelson
'5081': Stanley_Tong
'5082': Stefaan_Declerk
'5083': Stefan_Holm
'5084': Stefan_Koubek
'5085': Stefan_Tafrov
'5086': Stefanie_De_Roux
'5087': Stefano_Accorsi
'5088': Stefano_Basalini
'5089': Stefano_Gabbana
'5090': Steffeny_Holtz
'5091': Steffi_Graf
'5092': Stella_Keitel
'5093': Stella_McCartney
'5094': Stella_Tennant
'5095': Stellan_Skarsgard
'5096': Steny_Hoyer
'5097': Stepan_Demirchian
'5098': Stephan_Eberharter
'5099': Stephane_Delajoux
'5100': Stephane_Rochon
'5101': Stephane_Rousseau
'5102': Stephanie_Cohen_Aloro
'5103': Stephanie_Moore
'5104': Stephanie_Zimbalist
'5105': Stephen_Ambrose
'5106': Stephen_Arigbabu
'5107': Stephen_Cooper
'5108': Stephen_Covey
'5109': Stephen_Crampton
'5110': Stephen_Daldry
'5111': Stephen_Ebberharter
'5112': Stephen_Frears
'5113': Stephen_Friedman
'5114': Stephen_Funk
'5115': Stephen_Glassroth
'5116': Stephen_Joseph
'5117': Stephen_Keener
'5118': Stephen_Oake
'5119': Stephen_Push
'5120': Stephen_Silas
'5121': Stephen_Swindal
'5122': Stephen_Thompson
'5123': Stephen_Webster
'5124': Sterling_Hitchcock
'5125': Steve-O
'5126': Steve_Alford
'5127': Steve_Allan
'5128': Steve_Allee
'5129': Steve_Austin
'5130': Steve_Avery
'5131': Steve_Backley
'5132': Steve_Ballmer
'5133': Steve_Blake
'5134': Steve_Blankenship
'5135': Steve_Case
'5136': Steve_Coogan
'5137': Steve_Coterill
'5138': Steve_Cox
'5139': Steve_Cutler
'5140': Steve_Fehr
'5141': Steve_Karsay
'5142': Steve_Kerr
'5143': Steve_Largent
'5144': Steve_Lavin
'5145': Steve_Lenard
'5146': Steve_Mariucci
'5147': Steve_McManaman
'5148': Steve_Nash
'5149': Steve_Nesbitt
'5150': Steve_Pagliuca
'5151': Steve_Park
'5152': Steve_Patterson
'5153': Steve_Peace
'5154': Steve_Phillips
'5155': Steve_Redgrave
'5156': Steve_Rush
'5157': Steve_Shiver
'5158': Steve_Spurrier
'5159': Steve_Stirling
'5160': Steve_Valentine
'5161': Steve_Wariner
'5162': Steve_Waugh
'5163': Steve_Zahn
'5164': Steven_Briggs
'5165': Steven_Craig
'5166': Steven_Curtis_Chapman
'5167': Steven_Feldman
'5168': Steven_Hatfill
'5169': Steven_Kinlock
'5170': Steven_Seagal
'5171': Steven_Spielberg
'5172': Steven_Tyler
'5173': Steven_Van_Zandt
'5174': Stipe_Mesic
'5175': Stockard_Channing
'5176': Strom_Thurmond
'5177': Stuart_Knoll
'5178': Stuart_Townsend
'5179': Stuart_Whitman
'5180': Sue_Grafton
'5181': Sue_Guevara
'5182': Sue_Johnston
'5183': Sue_Slavec
'5184': Sue_Wicks
'5185': Suh_Chung-won
'5186': Suh_Young-hoon
'5187': Suk_Chung_Hong
'5188': Sultan_Qaboos
'5189': Sun_Myung_Moon
'5190': Sung_Hong_Choi
'5191': Supachai_Panitchpakdi
'5192': Surakait_Sathirathai
'5193': Sureyya_Ayhan
'5194': Surya_Bahadur_Thapa
'5195': Susan_Collins
'5196': Susan_Sarandon
'5197': Susan_Walvius
'5198': Susan_Whelan
'5199': Sushma_Swaraj
'5200': Susie_Castillo
'5201': Susilo_Bambang_Yudhoyono
'5202': Suzanne_Fox
'5203': Suzanne_Gaudet
'5204': Suzanne_Haik_Terrell
'5205': Suzanne_Mubarak
'5206': Suzanne_Somers
'5207': Suzanne_Torrance
'5208': Suzie_McConnell_Serio
'5209': Sven_Goran_Eriksson
'5210': Sven_Ottke
'5211': Svend_Aage_Jensby
'5212': Svend_Robinson
'5213': Svetislav_Pesic
'5214': Svetlana_Belousova
'5215': Svetlana_Koroleva
'5216': Svetoslav_Todorov
'5217': Sybille_Schmid
'5218': Syed_Abdul_Rahman_Geelani
'5219': Syed_Ibrahim
'5220': Sylvester_Stallone
'5221': Sylvia_Plachy
'5222': Sylvie_Guillem
'5223': Szu_Yu_Chen
'5224': TA_McLendon
'5225': TJ_Ford
'5226': T_Boone_Pickens
'5227': Tab_Baldwin
'5228': Tab_Turner
'5229': Tabare_Vazquez
'5230': Taha_Yassin_Ramadan
'5231': Taia_Balk
'5232': Takahiro_Mori
'5233': Takaloo
'5234': Takashi_Sorimachi
'5235': Takashi_Yamamoto
'5236': Takenori_Kanzaki
'5237': Takeo_Fukui
'5238': Takeo_Hiranuma
'5239': Takeshi_Kitano
'5240': Taku_Yamasaki
'5241': Takuma_Sato
'5242': Talal_Keenaan
'5243': Tali_Imani
'5244': Talisa_Bratt
'5245': Talisa_Soto
'5246': Tamara_Brooks
'5247': Tamara_Mowry
'5248': Tamara_Stokes
'5249': Tamika_Catchings
'5250': Tammy_Helm
'5251': Tammy_Lynn_Michaels
'5252': Tang_Jiaxuan
'5253': Tangra_Riggle
'5254': Tanya_Holyk
'5255': Tanya_Lindenmuth
'5256': Taoufik_Mathlouthi
'5257': Tara_Dawn_Christensen
'5258': Tara_Kirk
'5259': Tara_Reid
'5260': Tara_VanDerveer
'5261': Tariq_Aziz
'5262': Tassos_Papadopoulos
'5263': Tatiana_Gratcheva
'5264': Tatiana_Kennedy_Schlossberg
'5265': Tatiana_Panova
'5266': Tatiana_Paus
'5267': Tatiana_Shchegoleva
'5268': Tatjana_Gsell
'5269': Tatsuya_Fuji
'5270': Tatyana_Tomashova
'5271': Taufik_Hidayat
'5272': Taufik_Kiemas
'5273': Tavis_Smiley
'5274': Taylor_Twellman
'5275': Taylyn_Solomon
'5276': Tayshaun_Prince
'5277': Tayyeb_Abdel_Rahim
'5278': Ted_Christopher
'5279': Ted_Costa
'5280': Ted_Maher
'5281': Ted_Nolan
'5282': Ted_Turner
'5283': Ted_Washington
'5284': Ted_Williams
'5285': Teddy_Kollek
'5286': Terence_Newman
'5287': Teresa_Graves
'5288': Teresa_Heinz_Kerry
'5289': Teresa_Williams
'5290': Teresa_Worbis
'5291': Teri_Files
'5292': Teri_Garr
'5293': Teri_ORourke
'5294': Terje_Roed-Larsen
'5295': Terrell_Suggs
'5296': Terrence_Kiel
'5297': Terrence_Trammell
'5298': Terri_Clark
'5299': Terry_Bradshaw
'5300': Terry_Gilliam
'5301': Terry_Hoeppner
'5302': Terry_Lynn_Barton
'5303': Terry_McAuliffe
'5304': Terry_Semel
'5305': Terry_Stotts
'5306': Teruaki_Masumoto
'5307': Terunobu_Maeda
'5308': Tessa_Jowell
'5309': Tex_Ritter
'5310': Thabo_Mbeki
'5311': Thad_Matta
'5312': Thaksin_Shinawatra
'5313': Thalia
'5314': Thanongsak_Tuvinan
'5315': Theo_Angelopoulos
'5316': Theo_Epstein
'5317': Theodore_Tweed_Roosevelt
'5318': Theresa_Gattung
'5319': Theresa_May
'5320': Thierry_Falise
'5321': Thierry_Mariani
'5322': Thomas_Birmingham
'5323': Thomas_Bjorn
'5324': Thomas_Cloyd
'5325': Thomas_Daily
'5326': Thomas_Day
'5327': Thomas_Enqvist
'5328': Thomas_Fargo
'5329': Thomas_Ferguson
'5330': Thomas_Franklin
'5331': Thomas_Gottschalk
'5332': Thomas_Haeggstroem
'5333': Thomas_Kelly
'5334': Thomas_Klestil
'5335': Thomas_Malchow
'5336': Thomas_Manger
'5337': Thomas_Mesereau_Jr
'5338': Thomas_OBrien
'5339': Thomas_Rupprath
'5340': Thomas_Scavone
'5341': Thomas_Stewart
'5342': Thomas_Ulrich
'5343': Thomas_Van_Essen
'5344': Thomas_Watjen
'5345': Thomas_Weston
'5346': Thomas_Wilkens
'5347': Thomas_Wyman
'5348': Thor_Pedersen
'5349': Tia_Mowry
'5350': Tiago_Splitter
'5351': Tian_Liang
'5352': Tian_Zhuang_Zhuang
'5353': Tiffany_Limos
'5354': Tiger_Woods
'5355': Tim_Allen
'5356': Tim_Blake_Nelson
'5357': Tim_Chapman
'5358': Tim_Conway
'5359': Tim_Curley
'5360': Tim_Curry
'5361': Tim_Duncan
'5362': Tim_Floyd
'5363': Tim_Henman
'5364': Tim_Howard
'5365': Tim_Jones
'5366': Tim_Lobinger
'5367': Tim_Lopes
'5368': Tim_Matheson
'5369': Tim_Norbeck
'5370': Tim_Pawlenty
'5371': Tim_Robbins
'5372': Tim_Salmon
'5373': Tim_Welsh
'5374': Timbul_Silaen
'5375': Timothy_Coughlin
'5376': Timothy_Goebel
'5377': Timothy_McVeigh
'5378': Timothy_Rigas
'5379': Timothy_Wirth
'5380': Tina_Andrews
'5381': Tina_Brown
'5382': Tina_Conner
'5383': Tina_Fey
'5384': Tina_Pisnik
'5385': Tina_Sinatra
'5386': Tino_Martinez
'5387': Tippi_Hedren
'5388': Tirunesh_Dibaba
'5389': Toby_Keith
'5390': Tocker_Pudwill
'5391': Todd_Haynes
'5392': Todd_MacCulloch
'5393': Todd_Parrott
'5394': Todd_Petit
'5395': Todd_Reid
'5396': Todd_Robbins
'5397': Todd_Wike
'5398': Tom_Amstutz
'5399': Tom_Brady
'5400': Tom_Brennan
'5401': Tom_Christerson
'5402': Tom_Coughlin
'5403': Tom_Coverdale
'5404': Tom_Craddick
'5405': Tom_Crean
'5406': Tom_Cruise
'5407': Tom_Curley
'5408': Tom_Daschle
'5409': Tom_DeLay
'5410': Tom_Foy
'5411': Tom_Gamboa
'5412': Tom_Glavine
'5413': Tom_Hanks
'5414': Tom_Hanusik
'5415': Tom_Harkin
'5416': Tom_Izzo
'5417': Tom_Jones
'5418': Tom_Kelly
'5419': Tom_Koenigs
'5420': Tom_Lantos
'5421': Tom_McClintock
'5422': Tom_Miller
'5423': Tom_Moss
'5424': Tom_OBrien
'5425': Tom_Osborne
'5426': Tom_Poston
'5427': Tom_Reilly
'5428': Tom_Ridge
'5429': Tom_Rouen
'5430': Tom_Schnackenberg
'5431': Tom_Scully
'5432': Tom_Sizemore
'5433': Tom_Smothers
'5434': Tom_Tunney
'5435': Tom_Vilsack
'5436': Tom_Watson
'5437': Tom_Welch
'5438': Tomas_Enge
'5439': Tomas_Malik
'5440': Tommy_Amaker
'5441': Tommy_Franks
'5442': Tommy_Haas
'5443': Tommy_Lasorda
'5444': Tommy_Lewis
'5445': Tommy_Maddox
'5446': Tommy_Robredo
'5447': Tommy_Shane_Steiner
'5448': Tommy_Thompson
'5449': Tommy_Tubberville
'5450': Tomoko_Hagiwara
'5451': Tomomi_Morita
'5452': Tonga
'5453': Toni_Braxton
'5454': Toni_Jennings
'5455': Tonino_Guerra
'5456': Tono_Suratman
'5457': Tony_Bennett
'5458': Tony_Blair
'5459': Tony_Clement
'5460': Tony_Cummo
'5461': Tony_Curtis
'5462': Tony_Elias
'5463': Tony_Fernandes
'5464': Tony_LaRussa
'5465': Tony_Parker
'5466': Tony_Shalhoub
'5467': Tony_Stewart
'5468': Tonya_Payne
'5469': Tora_Takagi
'5470': Tori_Amos
'5471': Torri_Edwards
'5472': Toshi_Izawa
'5473': Toshihiko_Fukui
'5474': Toshimitsu_Motegi
'5475': Toutai_Kefu
'5476': Tracee_Ellis_Ross
'5477': Tracee_Treadwell
'5478': Tracy_McGrady
'5479': Tracy_Wyle
'5480': Travis_Rudolph
'5481': Trent_Lott
'5482': Trevor_McDonald
'5483': Trevor_Watson
'5484': Trisha_Meili
'5485': Trista_Rehn
'5486': Tristan_Gretzky
'5487': Troy_Aikman
'5488': Troy_Garity
'5489': Troy_Hudson
'5490': Troy_Jenkins
'5491': Troy_Polamalu
'5492': Trudi_Lacey
'5493': Tsutomu_Takebe
'5494': Tubby_Smith
'5495': Tuncay_Sanli
'5496': Tung_Chee-hwa
'5497': Turner_Gill
'5498': Turner_Stevenson
'5499': Ty_Votaw
'5500': Tyler_Grillo
'5501': Tyler_Hamilton
'5502': Tyra_Banks
'5503': Tyron_Garner
'5504': Tyrone_Medley
'5505': Tzipora_Obziler
'5506': Uday_Hussein
'5507': Ulrich_Kueperkoch
'5508': Uma_Thurman
'5509': Uri_Lopolianski
'5510': Urmila_Matondkar
'5511': Uthai_Pimchaichon
'5512': Uzi_Even
'5513': Uzi_Landau
'5514': Vaclav_Havel
'5515': Vaclav_Klaus
'5516': Vadim_Devyatovskiy
'5517': Vadim_Strogalev
'5518': Vagit_Alekperov
'5519': Val_Ackerman
'5520': Valdas_Adamkus
'5521': Valentina_Cervi
'5522': Valentina_Tereshkova
'5523': Valentino_Rossi
'5524': Valeri_Bure
'5525': Valerie_Harper
'5526': Valerie_Thwaites
'5527': Valery_Giscard_dEstaing
'5528': Valorie_Brabazon
'5529': Van_Hilley
'5530': Vanessa_Incontrada
'5531': Vanessa_Laine
'5532': Vanessa_Redgrave
'5533': Vanessa_Williams
'5534': Vassilis_Xiros
'5535': Vecdi_Gonul
'5536': Venus_Williams
'5537': Vernon_Forrest
'5538': Veronica_Lake
'5539': Viara_Vike-Freiberga
'5540': Vicente_Fernandez
'5541': Vicente_Fox
'5542': Vicente_Fox_de_la_Concha
'5543': Vicki_Zhao_Wei
'5544': Victor_Garber
'5545': Victor_Hanescu
'5546': Victor_Kraatz
'5547': Victoria_Beckham
'5548': Victoria_Clarke
'5549': Vidar_Helgesen
'5550': Vijay_Nambiar
'5551': Viktor_Yushchenko
'5552': Vin_Diesel
'5553': Vince_Carter
'5554': Vince_Dooley
'5555': Vince_Gill
'5556': Vince_Vaughan
'5557': Vincent_Brooks
'5558': Vincent_Cianci_Jr
'5559': Vincent_Gallo
'5560': Vincent_Sombrotto
'5561': Vincent_Spadea
'5562': Vinnie_Jones
'5563': Viola_Davis
'5564': Virgina_Ruano_Pascal
'5565': Vitali_Klitschko
'5566': Vivica_Fox
'5567': Vladimir_Golovlyov
'5568': Vladimir_Meciar
'5569': Vladimir_Putin
'5570': Vladimir_Spidla
'5571': Vladimir_Ustinov
'5572': Vladimir_Voltchkov
'5573': Vladimiro_Montesinos
'5574': Vojislav_Kostunica
'5575': Vojislav_Seselj
'5576': Vyacheslav_Fetisov
'5577': Vytas_Danelius
'5578': Walid_Al-Awadi
'5579': Wallace_Capel
'5580': Wally_Szczerbiak
'5581': Walt_Harris
'5582': Walter_Annenberg
'5583': Walter_Mondale
'5584': Walter_Woods
'5585': Wan_Yanhai
'5586': Wanda_Ilene_Barzee
'5587': Wanda_de_la_Jesus
'5588': Wang_Fei
'5589': Wang_Hailan
'5590': Wang_Nan
'5591': Wang_Yi
'5592': Wang_Yingfan
'5593': Ward_Cuff
'5594': Warren_Beatty
'5595': Warren_Buffett
'5596': Warren_Granados
'5597': Warren_Truss
'5598': Wayne_Allard
'5599': Wayne_Brady
'5600': Wayne_Ferreira
'5601': Wayne_Gretzky
'5602': Wayne_Newton
'5603': Wei_Wu
'5604': Wen_Ho_Lee
'5605': Wen_Jiabao
'5606': Wendell_Bryant
'5607': Wendy_Kennedy
'5608': Wendy_Selig
'5609': Werner_Schlager
'5610': Wes_Craven
'5611': Wesley_Clark
'5612': Whoopi_Goldberg
'5613': Wilbert_Elki_Meza_Majino
'5614': Wilbert_Foy
'5615': Wilfredo_Moreno
'5616': Will_Ferrell
'5617': Will_Ofenheusle
'5618': Will_Self
'5619': Will_Smith
'5620': Will_Young
'5621': William_Bratton
'5622': William_Bulger
'5623': William_Burns
'5624': William_Cocksedge
'5625': William_Delahunt
'5626': William_Donaldson
'5627': William_Ford_Jr
'5628': William_Genego
'5629': William_Harrison
'5630': William_Hochul
'5631': William_Hurt
'5632': William_Hyde
'5633': William_Jackson
'5634': William_Joppy
'5635': William_Macy
'5636': William_Martin
'5637': William_McDonough
'5638': William_Morrow
'5639': William_Murabito
'5640': William_Nessen
'5641': William_Overlin
'5642': William_Perry
'5643': William_Pryor_Jr
'5644': William_Ragland
'5645': William_Rehnquist
'5646': William_Rosenberg
'5647': William_Shatner
'5648': William_Swor
'5649': William_Umbach
'5650': William_Webster
'5651': Willie_Nelson
'5652': Willie_Wilson
'5653': Willis_Roberts
'5654': Wilma_McNabb
'5655': Wilson_Alvarez
'5656': Wilton_Gregory
'5657': Wim_Duisenberg
'5658': Win_Aung
'5659': Winona_Ryder
'5660': Winston_Churchill
'5661': Wolfgang_Becker
'5662': Wolfgang_Clement
'5663': Wolfgang_Schneiderhan
'5664': Wolfgang_Schuessel
'5665': Wolfgang_Schwarz
'5666': Woodrow_Stanley
'5667': Woody_Allen
'5668': Wu_Peng
'5669': Wu_Yi
'5670': Wycliffe_Grousbeck
'5671': Xanana_Gusmao
'5672': Xavier_Malisse
'5673': Xiang_Huaicheng
'5674': Xiang_Liu
'5675': Xiang_Xu
'5676': Ximena_Bohorquez
'5677': Yale_Kamisar
'5678': Yana_Klochkova
'5679': Yang_Hee_Kim
'5680': Yang_Jianli
'5681': Yang_Pao-yu
'5682': Yann_Martel
'5683': Yannos_Papantoniou
'5684': Yao_Ming
'5685': Yasar_Yakis
'5686': Yasein_Taher
'5687': Yashwant_Sinha
'5688': Yasser_Arafat
'5689': Yasushi_Akashi
'5690': Yasushi_Chimura
'5691': Yekaterina_Guseva
'5692': Yevgeny_Kafelnikov
'5693': Yingfan_Wang
'5694': Yishan_Zhang
'5695': Yoelbi_Quesada
'5696': Yogi_Berra
'5697': Yoko_Ono
'5698': Yolanda_King
'5699': Yoo_Jay-Kun
'5700': Yoon_Jeong_Cho
'5701': Yoon_Jin-Sik
'5702': Yoon_Won-Sik
'5703': Yoon_Young-kwan
'5704': Yoriko_Kawaguchi
'5705': Yory_Boy_Campas
'5706': Yoshiyuki_Kamei
'5707': Yossi_Beilin
'5708': Young_Kim
'5709': Yu_Shyi-kun
'5710': Yukiko_Okudo
'5711': Yukio_Hatoyama
'5712': Yuri_Fedotov
'5713': Yuri_Luzhkov
'5714': Yuri_Malenchenko
'5715': Yusaku_Miyazato
'5716': Yusuf_Misbac
'5717': Yuvraj_Singh
'5718': Yves_Brodeur
'5719': Zach_Parise
'5720': Zach_Pillar
'5721': Zach_Safrin
'5722': Zafarullah_Khan_Jamali
'5723': Zahir_Shah
'5724': Zaini_Abdullah
'5725': Zakia_Hakki
'5726': Zalmay_Khalilzad
'5727': Zara_Akhmadova
'5728': Zarai_Toledo
'5729': Zavad_Zarif
'5730': Zdravko_Mucic
'5731': Zeljko_Rebraca
'5732': Zelma_Novelo
'5733': Zeng_Qinghong
'5734': Zhang_Wenkang
'5735': Zhang_Yimou
'5736': Zhang_Ziyi
'5737': Zhong_Nanshan
'5738': Zhu_Rongji
'5739': Zico
'5740': Zinedine_Zidane
'5741': Ziwang_Xu
'5742': Zoe_Ball
'5743': Zoran_Djindjic
'5744': Zorica_Radovic
'5745': Zulfiqar_Ahmed
'5746': Zumrati_Juma
'5747': Zurab_Tsereteli
'5748': Zydrunas_Ilgauskas
- name: image
dtype: image
splits:
- name: train
num_bytes: 190505484.194
num_examples: 13233
download_size: 188443388
dataset_size: 190505484.194
- config_name: pairs
features:
- name: pair
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: img_0
dtype: image
- name: img_1
dtype: image
splits:
- name: train
num_bytes: 28580331.0
num_examples: 1000
- name: test
num_bytes: 62912614.2
num_examples: 2200
download_size: 84352250
dataset_size: 91492945.2
configs:
- config_name: aug
data_files:
- split: train
path: aug/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: pairs
data_files:
- split: train
path: pairs/train-*
- split: test
path: pairs/test-*
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
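The `pairs` configuration in the metadata above stores two images (`img_0`, `img_1`) and a binary `pair` class label per example, the usual layout for a face-verification benchmark. As a minimal, hypothetical sketch (the toy embeddings, the threshold, and the assumption that `1` marks a matched identity are all illustrative, not taken from this dataset), a verification decision over precomputed face embeddings could look like:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def predict_pair(emb_0, emb_1, threshold=0.8):
    """Return 1 if the two embeddings look like the same person, else 0.

    Mirrors the binary `pair` class_label of the `pairs` config under the
    (illustrative) assumption that 1 means "same identity".
    """
    return 1 if cosine_similarity(emb_0, emb_1) >= threshold else 0

# Toy embeddings standing in for features extracted from img_0 / img_1.
same = predict_pair([1.0, 0.0, 1.0], [0.9, 0.1, 1.1])
diff = predict_pair([1.0, 0.0, 1.0], [-1.0, 1.0, 0.0])
```

In practice the embeddings would come from a face-recognition model applied to `img_0` and `img_1`, and the threshold would be tuned on the train split.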
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TimoImhof/HotpotQA-in-SQuAD-format | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: unmodified
num_bytes: 7657753
num_examples: 6113
- name: modified_30_percent
num_bytes: 7662336
num_examples: 6113
- name: modified_100_percent
num_bytes: 7673192
num_examples: 6113
download_size: 12541785
dataset_size: 22993281
---
# Dataset Card for "HotpotQA-in-SQuAD-format"
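The features listed in the metadata mirror the SQuAD layout: a `context` string plus an `answers` struct holding parallel `answer_start` and `text` sequences. A minimal sketch with a hypothetical record (the question and context below are invented for illustration) shows how the character offsets index into the context:

```python
# Hypothetical record following the id/question/context/answers schema above.
record = {
    "id": "example-0",
    "question": "Which city is the capital of France?",
    "context": "Paris is the capital and largest city of France.",
    "answers": {"answer_start": [0], "text": ["Paris"]},
}

def extract_answers(rec):
    """Recover each answer span from the context via its character offset."""
    spans = []
    for start, text in zip(rec["answers"]["answer_start"], rec["answers"]["text"]):
        spans.append(rec["context"][start:start + len(text)])
    return spans

spans = extract_answers(record)
```

A well-formed record satisfies `context[start:start + len(text)] == text` for every answer, which is a cheap sanity check to run over any of the three splits.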
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argilla/agnews_weak_labeling | ---
language: en
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
dtype: string
- name: annotation_agent
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: 'null'
- name: metadata
struct:
- name: split
dtype: string
- name: status
dtype: string
- name: event_timestamp
dtype: 'null'
- name: metrics
dtype: 'null'
- name: vectors
struct:
- name: mini-lm-sentence-transformers
sequence: float64
splits:
- name: train
num_bytes: 25212139
num_examples: 7000
download_size: 20872343
dataset_size: 25212139
---
# Dataset Card for "agnews_weak_labeling"
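Each record carries a `vectors.mini-lm-sentence-transformers` embedding alongside its `annotation`, so one common weak-labeling move is to propagate the label of the nearest annotated neighbour in embedding space. A small sketch with toy vectors (the embeddings and labels below are invented for illustration; real MiniLM sentence embeddings are much higher-dimensional):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def nearest_label(query_vec, labeled):
    """Return the annotation of the labeled record closest in cosine similarity."""
    best = max(labeled, key=lambda rec: cosine(query_vec, rec["vector"]))
    return best["annotation"]

# Toy stand-ins for (vector, annotation) pairs taken from annotated records.
labeled = [
    {"vector": [1.0, 0.0], "annotation": "Sports"},
    {"vector": [0.0, 1.0], "annotation": "Business"},
]
label = nearest_label([0.9, 0.1], labeled)
```

This is only one way to use the stored vectors; the dataset itself just ships the embeddings and leaves the labeling strategy to the user.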
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lajavaness/SICK-fr | ---
license: apache-2.0
---
|
saklee/qdqwdqw | ---
license: openrail
task_categories:
- text-classification
- text-generation
language:
- ae
- ar
tags:
- music
- not-for-all-audiences
size_categories:
- 100B<n<1T
--- |
open-llm-leaderboard/details_voidful__qd-phi-1_5 | ---
pretty_name: Evaluation run of voidful/qd-phi-1_5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [voidful/qd-phi-1_5](https://huggingface.co/voidful/qd-phi-1_5) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_voidful__qd-phi-1_5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T21:18:27.627362](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__qd-phi-1_5/blob/main/results_2024-04-07T21-18-27.627362.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3645232776260812,\n\
\ \"acc_stderr\": 0.033681797060402946,\n \"acc_norm\": 0.36780017371067014,\n\
\ \"acc_norm_stderr\": 0.03456200693061439,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.4422544970498814,\n\
\ \"mc2_stderr\": 0.015504265080594881\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.45563139931740615,\n \"acc_stderr\": 0.014553749939306868,\n\
\ \"acc_norm\": 0.4948805460750853,\n \"acc_norm_stderr\": 0.014610624890309157\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4644493128858793,\n\
\ \"acc_stderr\": 0.004977152746478585,\n \"acc_norm\": 0.6073491336387173,\n\
\ \"acc_norm_stderr\": 0.0048734218332915635\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.3925925925925926,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03782728980865471,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03782728980865471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.39622641509433965,\n \"acc_stderr\": 0.030102793781791194,\n\
\ \"acc_norm\": 0.39622641509433965,\n \"acc_norm_stderr\": 0.030102793781791194\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596239,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596239\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.36551724137931035,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.36551724137931035,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068652,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068652\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276865,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276865\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3741935483870968,\n\
\ \"acc_stderr\": 0.027528904299845783,\n \"acc_norm\": 0.3741935483870968,\n\
\ \"acc_norm_stderr\": 0.027528904299845783\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.45454545454545453,\n \"acc_stderr\": 0.03547601494006938,\n \"\
acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03547601494006938\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.034998072761933376,\n\
\ \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.034998072761933376\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.44403669724770645,\n \"acc_stderr\": 0.021302621211654525,\n \"\
acc_norm\": 0.44403669724770645,\n \"acc_norm_stderr\": 0.021302621211654525\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"\
acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.35784313725490197,\n \"acc_stderr\": 0.03364487286088299,\n \"\
acc_norm\": 0.35784313725490197,\n \"acc_norm_stderr\": 0.03364487286088299\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3628691983122363,\n \"acc_stderr\": 0.031299208255302136,\n \
\ \"acc_norm\": 0.3628691983122363,\n \"acc_norm_stderr\": 0.031299208255302136\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47085201793721976,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.47085201793721976,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4462809917355372,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.4462809917355372,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3619631901840491,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.3619631901840491,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.46601941747572817,\n \"acc_stderr\": 0.0493929144727348,\n\
\ \"acc_norm\": 0.46601941747572817,\n \"acc_norm_stderr\": 0.0493929144727348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5641025641025641,\n\
\ \"acc_stderr\": 0.032485775115784016,\n \"acc_norm\": 0.5641025641025641,\n\
\ \"acc_norm_stderr\": 0.032485775115784016\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4329501915708812,\n\
\ \"acc_stderr\": 0.017718469101513982,\n \"acc_norm\": 0.4329501915708812,\n\
\ \"acc_norm_stderr\": 0.017718469101513982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.026424816594009852,\n\
\ \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.026424816594009852\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553984,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553984\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.02827549015679143,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.02827549015679143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3890675241157556,\n\
\ \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.3890675241157556,\n\
\ \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.36728395061728397,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.36728395061728397,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.0271871270115038,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.0271871270115038\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2835723598435463,\n\
\ \"acc_stderr\": 0.011511900775968318,\n \"acc_norm\": 0.2835723598435463,\n\
\ \"acc_norm_stderr\": 0.011511900775968318\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2867647058823529,\n \"acc_stderr\": 0.027472274473233818,\n\
\ \"acc_norm\": 0.2867647058823529,\n \"acc_norm_stderr\": 0.027472274473233818\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.32189542483660133,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.32189542483660133,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4448979591836735,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.4448979591836735,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4925373134328358,\n\
\ \"acc_stderr\": 0.035351400842767194,\n \"acc_norm\": 0.4925373134328358,\n\
\ \"acc_norm_stderr\": 0.035351400842767194\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.36257309941520466,\n \"acc_stderr\": 0.0368713061556206,\n\
\ \"acc_norm\": 0.36257309941520466,\n \"acc_norm_stderr\": 0.0368713061556206\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.4422544970498814,\n\
\ \"mc2_stderr\": 0.015504265080594881\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7087608524072613,\n \"acc_stderr\": 0.012769029305370692\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \
\ \"acc_stderr\": 0.0026153265107756703\n }\n}\n```"
repo_url: https://huggingface.co/voidful/qd-phi-1_5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|arc:challenge|25_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|arc:challenge|25_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|gsm8k|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|gsm8k|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hellaswag|10_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hellaswag|10_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T08-27-11.490097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T21-18-27.627362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T21-18-27.627362.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- '**/details_harness|winogrande|5_2024-04-03T08-27-11.490097.parquet'
- split: 2024_04_07T21_18_27.627362
path:
- '**/details_harness|winogrande|5_2024-04-07T21-18-27.627362.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T21-18-27.627362.parquet'
- config_name: results
data_files:
- split: 2024_04_03T08_27_11.490097
path:
- results_2024-04-03T08-27-11.490097.parquet
- split: 2024_04_07T21_18_27.627362
path:
- results_2024-04-07T21-18-27.627362.parquet
- split: latest
path:
- results_2024-04-07T21-18-27.627362.parquet
---
# Dataset Card for Evaluation run of voidful/qd-phi-1_5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [voidful/qd-phi-1_5](https://huggingface.co/voidful/qd-phi-1_5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_voidful__qd-phi-1_5",
"harness_winogrande_5",
	split="latest")
```
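To target a specific run rather than the latest one, the split name can be derived from the run timestamp. The helper below is an illustrative sketch based on the split names listed in the YAML header above (`-` and `:` replaced by `_`), not part of the `datasets` API:

```python
# Splits in this dataset are named after the run timestamp, with "-" and ":"
# replaced by "_", e.g. run "2024-04-07T21:18:27.627362" is stored under the
# split "2024_04_07T21_18_27.627362".
def run_split_name(timestamp: str) -> str:
    # Hypothetical helper for illustration only.
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2024-04-07T21:18:27.627362"))
# -> 2024_04_07T21_18_27.627362
```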
## Latest results
These are the [latest results from run 2024-04-07T21:18:27.627362](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__qd-phi-1_5/blob/main/results_2024-04-07T21-18-27.627362.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3645232776260812,
"acc_stderr": 0.033681797060402946,
"acc_norm": 0.36780017371067014,
"acc_norm_stderr": 0.03456200693061439,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236618,
"mc2": 0.4422544970498814,
"mc2_stderr": 0.015504265080594881
},
"harness|arc:challenge|25": {
"acc": 0.45563139931740615,
"acc_stderr": 0.014553749939306868,
"acc_norm": 0.4948805460750853,
"acc_norm_stderr": 0.014610624890309157
},
"harness|hellaswag|10": {
"acc": 0.4644493128858793,
"acc_stderr": 0.004977152746478585,
"acc_norm": 0.6073491336387173,
"acc_norm_stderr": 0.0048734218332915635
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03782728980865471,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03782728980865471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.39622641509433965,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.39622641509433965,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596239,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.36551724137931035,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.36551724137931035,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068652,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068652
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276865,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276865
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3741935483870968,
"acc_stderr": 0.027528904299845783,
"acc_norm": 0.3741935483870968,
"acc_norm_stderr": 0.027528904299845783
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03547601494006938,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03547601494006938
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.034998072761933376,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.034998072761933376
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.44403669724770645,
"acc_stderr": 0.021302621211654525,
"acc_norm": 0.44403669724770645,
"acc_norm_stderr": 0.021302621211654525
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.02513045365226846,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.02513045365226846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.35784313725490197,
"acc_stderr": 0.03364487286088299,
"acc_norm": 0.35784313725490197,
"acc_norm_stderr": 0.03364487286088299
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3628691983122363,
"acc_stderr": 0.031299208255302136,
"acc_norm": 0.3628691983122363,
"acc_norm_stderr": 0.031299208255302136
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47085201793721976,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.47085201793721976,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4462809917355372,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.4462809917355372,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3619631901840491,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.3619631901840491,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.46601941747572817,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.46601941747572817,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.032485775115784016,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.032485775115784016
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4329501915708812,
"acc_stderr": 0.017718469101513982,
"acc_norm": 0.4329501915708812,
"acc_norm_stderr": 0.017718469101513982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.026424816594009852,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.026424816594009852
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553984,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553984
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.02827549015679143,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.02827549015679143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3890675241157556,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.3890675241157556,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.36728395061728397,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.36728395061728397,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.0271871270115038,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.0271871270115038
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2835723598435463,
"acc_stderr": 0.011511900775968318,
"acc_norm": 0.2835723598435463,
"acc_norm_stderr": 0.011511900775968318
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2867647058823529,
"acc_stderr": 0.027472274473233818,
"acc_norm": 0.2867647058823529,
"acc_norm_stderr": 0.027472274473233818
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.32189542483660133,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.32189542483660133,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4448979591836735,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.4448979591836735,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4925373134328358,
"acc_stderr": 0.035351400842767194,
"acc_norm": 0.4925373134328358,
"acc_norm_stderr": 0.035351400842767194
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.36257309941520466,
"acc_stderr": 0.0368713061556206,
"acc_norm": 0.36257309941520466,
"acc_norm_stderr": 0.0368713061556206
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236618,
"mc2": 0.4422544970498814,
"mc2_stderr": 0.015504265080594881
},
"harness|winogrande|5": {
"acc": 0.7087608524072613,
"acc_stderr": 0.012769029305370692
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.0026153265107756703
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pkyriakis/r | ---
license: openrail
---
|
Miosdream/vits2 | ---
license: openrail
---
|
profetize/kirsten_v4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: Filename
dtype: string
- name: URL
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 64737198.97551546
num_examples: 2793
- name: test
num_bytes: 21602244.699312713
num_examples: 932
- name: validate
num_bytes: 21579066.32517182
num_examples: 931
download_size: 63041115
dataset_size: 107918510.0
---
# Dataset Card for "kirsten_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ | ---
pretty_name: Evaluation run of TheBloke/Llama-2-7b-Chat-AWQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Llama-2-7b-Chat-AWQ](https://huggingface.co/TheBloke/Llama-2-7b-Chat-AWQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T01:23:20.549960](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ/blob/main/results_2023-10-24T01-23-20.549960.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\
acc\": 0.23756906077348067,\n \"acc_stderr\": 0.007017551441813875\n },\n\
\ \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\
\ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.47513812154696133,\n \"acc_stderr\": 0.01403510288362775\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Llama-2-7b-Chat-AWQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|arc:challenge|25_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T01_23_20.549960
path:
- '**/details_harness|drop|3_2023-10-24T01-23-20.549960.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T01-23-20.549960.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T01_23_20.549960
path:
- '**/details_harness|gsm8k|5_2023-10-24T01-23-20.549960.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T01-23-20.549960.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hellaswag|10_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T01_23_20.549960
path:
- '**/details_harness|winogrande|5_2023-10-24T01-23-20.549960.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T01-23-20.549960.parquet'
- config_name: results
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- results_2023-10-03T10-54-21.847398.parquet
- split: 2023_10_24T01_23_20.549960
path:
- results_2023-10-24T01-23-20.549960.parquet
- split: latest
path:
- results_2023-10-24T01-23-20.549960.parquet
---
# Dataset Card for Evaluation run of TheBloke/Llama-2-7b-Chat-AWQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-7b-Chat-AWQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-7b-Chat-AWQ](https://huggingface.co/TheBloke/Llama-2-7b-Chat-AWQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
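For reference, the split names visible in the YAML above are simply the run timestamps with `-` and `:` replaced by `_`; a small helper (ours, not part of the `datasets` API) makes the mapping explicit:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name, e.g.
    "2023-10-24T01:23:20.549960" -> "2023_10_24T01_23_20.549960"."""
    # Both separators become underscores; the "T" and the fractional
    # seconds are kept as-is.
    return timestamp.replace("-", "_").replace(":", "_")
```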
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-24T01:23:20.549960](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ/blob/main/results_2023-10-24T01-23-20.549960.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0,
"acc": 0.23756906077348067,
"acc_stderr": 0.007017551441813875
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.47513812154696133,
"acc_stderr": 0.01403510288362775
}
}
```
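As a quick sketch (not part of the card template), each reported `acc`/`acc_stderr` pair can be turned into an approximate 95% confidence interval using the normal approximation; the values below are copied from the `harness|winogrande|5` entry above:

```python
# Approximate 95% confidence interval from the reported mean accuracy and
# standard error (values copied from the "harness|winogrande|5" entry above).
winogrande = {"acc": 0.47513812154696133, "acc_stderr": 0.01403510288362775}

margin = 1.96 * winogrande["acc_stderr"]  # normal-approximation margin
ci_low = winogrande["acc"] - margin
ci_high = winogrande["acc"] + margin
print(f"winogrande acc: {winogrande['acc']:.3f} ± {margin:.3f}")
```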
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/nana | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Nana
This is the image base of the bangumi NANA. We detected 38 characters and 4462 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
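As an illustrative sketch (the helper name and local paths are our own, not part of the dataset), one way to unpack a downloaded per-character archive such as `0/dataset.zip` and list its images for manual noise filtering:

```python
import pathlib
import zipfile

def extract_character_archive(archive_path, dest_dir):
    """Unpack one character's dataset.zip into dest_dir and return the
    extracted image paths, ready for manual noise filtering."""
    dest = pathlib.Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(dest)
    # Keep only common image formats; other files (if any) are ignored.
    image_suffixes = {".png", ".jpg", ".jpeg", ".webp"}
    return sorted(p for p in dest.rglob("*") if p.suffix.lower() in image_suffixes)
```

After extraction, visually inspecting the returned paths (or running a near-duplicate/outlier detector over them) is a reasonable way to drop the roughly 1% of noisy samples before training.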
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 102 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) |
| 1 | 885 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) |
| 2 | 60 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) |
| 3 | 72 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) |
| 4 | 33 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) |
| 5 | 19 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) |
| 6 | 36 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) |
| 7 | 979 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) |
| 8 | 105 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) |
| 9 | 390 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) |
| 10 | 25 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) |
| 11 | 60 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) |
| 12 | 143 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) |
| 13 | 122 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) |
| 14 | 76 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) |
| 15 | 25 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) |
| 16 | 20 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) |
| 17 | 50 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) |
| 18 | 416 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) |
| 19 | 18 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) |
| 20 | 83 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) |
| 21 | 31 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) |
| 22 | 16 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) |
| 23 | 29 | [Download](23/dataset.zip) | ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) |
| 24 | 58 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) |
| 25 | 52 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) |
| 26 | 39 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) |
| 27 | 40 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) |
| 28 | 189 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) |
| 29 | 38 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | ![preview 5](29/preview_5.png) | ![preview 6](29/preview_6.png) | ![preview 7](29/preview_7.png) | ![preview 8](29/preview_8.png) |
| 30 | 34 | [Download](30/dataset.zip) | ![preview 1](30/preview_1.png) | ![preview 2](30/preview_2.png) | ![preview 3](30/preview_3.png) | ![preview 4](30/preview_4.png) | ![preview 5](30/preview_5.png) | ![preview 6](30/preview_6.png) | ![preview 7](30/preview_7.png) | ![preview 8](30/preview_8.png) |
| 31 | 35 | [Download](31/dataset.zip) | ![preview 1](31/preview_1.png) | ![preview 2](31/preview_2.png) | ![preview 3](31/preview_3.png) | ![preview 4](31/preview_4.png) | ![preview 5](31/preview_5.png) | ![preview 6](31/preview_6.png) | ![preview 7](31/preview_7.png) | ![preview 8](31/preview_8.png) |
| 32 | 60 | [Download](32/dataset.zip) | ![preview 1](32/preview_1.png) | ![preview 2](32/preview_2.png) | ![preview 3](32/preview_3.png) | ![preview 4](32/preview_4.png) | ![preview 5](32/preview_5.png) | ![preview 6](32/preview_6.png) | ![preview 7](32/preview_7.png) | ![preview 8](32/preview_8.png) |
| 33 | 7 | [Download](33/dataset.zip) | ![preview 1](33/preview_1.png) | ![preview 2](33/preview_2.png) | ![preview 3](33/preview_3.png) | ![preview 4](33/preview_4.png) | ![preview 5](33/preview_5.png) | ![preview 6](33/preview_6.png) | ![preview 7](33/preview_7.png) | N/A |
| 34 | 18 | [Download](34/dataset.zip) | ![preview 1](34/preview_1.png) | ![preview 2](34/preview_2.png) | ![preview 3](34/preview_3.png) | ![preview 4](34/preview_4.png) | ![preview 5](34/preview_5.png) | ![preview 6](34/preview_6.png) | ![preview 7](34/preview_7.png) | ![preview 8](34/preview_8.png) |
| 35 | 13 | [Download](35/dataset.zip) | ![preview 1](35/preview_1.png) | ![preview 2](35/preview_2.png) | ![preview 3](35/preview_3.png) | ![preview 4](35/preview_4.png) | ![preview 5](35/preview_5.png) | ![preview 6](35/preview_6.png) | ![preview 7](35/preview_7.png) | ![preview 8](35/preview_8.png) |
| 36 | 6 | [Download](36/dataset.zip) | ![preview 1](36/preview_1.png) | ![preview 2](36/preview_2.png) | ![preview 3](36/preview_3.png) | ![preview 4](36/preview_4.png) | ![preview 5](36/preview_5.png) | ![preview 6](36/preview_6.png) | N/A | N/A |
| noise | 78 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
|
winglian/evals | ---
task_categories:
- text-generation
- question-answering
language:
- en
size_categories:
- 1K<n<10K
---
# Instruct Augmented Datasets
This dataset takes various other datasets (multiple choice, summarization, etc.) and augments them for instruction fine-tuning. |
ovior/twitter_dataset_1712980317 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: string
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 790387
num_examples: 2593
download_size: 440572
dataset_size: 790387
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bahjat-kawar/tedbench | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: caption
dtype: string
- name: edited_image
dtype: image
splits:
- name: val
num_bytes: 142328486.0
num_examples: 100
download_size: 133432412
dataset_size: 142328486.0
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
---
# Dataset Card for TEdBench
**TEdBench** (**T**extual **E**diting **Bench**mark) contains 100 images and text descriptions depicting edits to be applied.
It is intended as an evaluation dataset for the task of text-based real image editing.
This dataset was first introduced in [Imagic](https://imagic-editing.github.io/).
## Dataset Structure
Each entry in this dataset contains a real image (`original_image`), a text describing the desired edit (`caption`), and Imagic's result for comparison purposes (`edited_image`).
Note that Imagic results are of varied quality, and should not be considered as ground-truth.
## Citation (BibTeX)
If you find this dataset useful in your work, please cite:
```
@inproceedings{kawar2023imagic,
title={Imagic: Text-Based Real Image Editing with Diffusion Models},
author={Kawar, Bahjat and Zada, Shiran and Lang, Oran and Tov, Omer and Chang, Huiwen and Dekel, Tali and Mosseri, Inbar and Irani, Michal},
booktitle={Conference on Computer Vision and Pattern Recognition 2023},
year={2023}
}
```
|
enoahjr/twitter_dataset_1713223180 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 278261
num_examples: 786
download_size: 139675
dataset_size: 278261
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
korexyz/plato | ---
license: mit
dataset_info:
features:
- name: entry_id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 186603801
num_examples: 2424
download_size: 91461530
dataset_size: 186603801
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-generation
language:
- en
tags:
- philosophy
pretty_name: Plato
size_categories:
- 1K<n<10K
---
# Plato: philosophy essays from plato.stanford.edu
Plato is a corpus of 2.4k high quality philosophy essays from [plato.stanford.edu](https://plato.stanford.edu). |
dkshjn/mixqa | ---
dataset_info:
features:
- name: question
dtype: string
- name: optionsKey
dtype: string
- name: prompt
dtype: string
- name: gold
dtype: string
splits:
- name: train
num_bytes: 1052037
num_examples: 1000
- name: test
num_bytes: 504473
num_examples: 500
download_size: 169625
dataset_size: 261568
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "mixqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shukawam/demo-dataset | ---
license: cc
---
|
forgeml/test | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: unsplash_query
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 67148083.0
num_examples: 5
download_size: 67080826
dataset_size: 67148083.0
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kheopss/concatenated_from_f1.0_to_f5.0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: text2
dtype: string
splits:
- name: train
num_bytes: 18443699548
num_examples: 11797780
download_size: 3999564365
dataset_size: 18443699548
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
skrishna/toxicity_preprop | ---
license: mit
---
|
princeton-nlp/SWE-bench_Lite_oracle | ---
dataset_info:
features:
- name: instance_id
dtype: string
- name: text
dtype: string
- name: repo
dtype: string
- name: base_commit
dtype: string
- name: problem_statement
dtype: string
- name: hints_text
dtype: string
- name: created_at
dtype: string
- name: patch
dtype: string
- name: test_patch
dtype: string
- name: version
dtype: string
- name: FAIL_TO_PASS
dtype: string
- name: PASS_TO_PASS
dtype: string
- name: environment_setup_commit
dtype: string
splits:
- name: dev
num_bytes: 1439991
num_examples: 23
- name: test
num_bytes: 20853665
num_examples: 300
download_size: 9371677
dataset_size: 22293656
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
### Dataset Summary
SWE-bench is a dataset that tests systems’ ability to solve GitHub issues automatically. The dataset collects 300 test Issue-Pull Request pairs from 11 popular Python repositories. Evaluation is performed by unit test verification using post-PR behavior as the reference solution.
The dataset was released as part of [SWE-bench: Can Language Models Resolve Real-World GitHub Issues?](https://arxiv.org/abs/2310.06770)
This dataset `SWE-bench_Lite_oracle` includes a formatting of each instance using the "Oracle" retrieval setting as described in the paper. The `text` column can be used directly with LMs to generate patch files.
Models are instructed to generate [`patch`](https://en.wikipedia.org/wiki/Patch_(Unix)) formatted file using the following template:
```diff
<patch>
diff
--- a/path/to/file.py
+++ b/path/to/file.py
@@ -1,3 +1,3 @@
This is a test file.
-It contains several lines.
+It has been modified.
This is the third line.
</patch>
```
This format can be used directly with the [SWE-bench inference scripts](https://github.com/princeton-nlp/SWE-bench/tree/main/inference). Please refer to these scripts for more details on inference.
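Given the template above, a model's generation can be post-processed by pulling the diff out of the `<patch>` tags. The sketch below is illustrative only (the `extract_patch` helper is an assumption, not part of the official scripts, which handle extraction more robustly):

```python
import re

def extract_patch(model_output: str) -> str:
    """Pull the diff out of a <patch>...</patch> block in a model generation.

    A minimal sketch for illustration; the official SWE-bench inference
    scripts perform extraction and post-processing more robustly.
    """
    match = re.search(r"<patch>\s*(.*?)\s*</patch>", model_output, re.DOTALL)
    if match is None:
        raise ValueError("no <patch> block found in model output")
    return match.group(1)

generation = """Here is my fix:
<patch>
diff
--- a/path/to/file.py
+++ b/path/to/file.py
@@ -1,3 +1,3 @@
 This is a test file.
-It contains several lines.
+It has been modified.
 This is the third line.
</patch>"""

patch = extract_patch(generation)
print(patch.splitlines()[0])  # diff
```

The extracted string can then be written to a `.patch` file and applied with standard `patch`/`git apply` tooling during evaluation.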
|
gweltou/wikipedia-br-20240325 | ---
license: apache-2.0
language:
- br
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
---
A corpus of sentences extracted from the Breton Wikipedia (cirrus dump).
The sentences were filtered so that only Breton sentences were kept.
Please note that the sentence-splitting algorithm is far from perfect, so many sentences may appear incorrect or incomplete. |
c-s-ale/transactpro-dataset | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 7808
num_examples: 24
- name: test
num_bytes: 976
num_examples: 3
- name: valid
num_bytes: 976
num_examples: 3
download_size: 15787
dataset_size: 9760
license: openrail
task_categories:
- table-question-answering
language:
- en
pretty_name: TransactPro FAQ Dataset
size_categories:
- n<1K
---
# Dataset Card for TransactPro FAQ!
This is a synthetic dataset made with GPT-4. |
AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_5_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1188178
num_examples: 3600
download_size: 496084
dataset_size: 1188178
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_cluster_5_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rahular/varta-urls | ---
license: cc
task_categories:
- summarization
- feature-extraction
language:
- as
- bh
- bn
- en
- gu
- hi
- kn
- ml
- mr
- ne
- or
- pa
- ta
- te
- ur
pretty_name: varta
size_categories:
- 1B<n<10B
---
## Dataset Description
- **Repository:** https://github.com/rahular/varta
- **Paper:** https://arxiv.org/abs/2305.05858
### Dataset Summary
Varta is a diverse, challenging, large-scale, multilingual, and high-quality headline-generation dataset containing 41.8 million news articles in 14 Indic languages and English.
The data is crawled from DailyHunt, a popular news aggregator in India that pulls high-quality articles from multiple trusted and reputed news publishers.
### Languages
Assamese, Bhojpuri, Bengali, English, Gujarati, Hindi, Kannada, Malayalam, Marathi, Nepali, Oriya, Punjabi, Tamil, Telugu, and Urdu.
## Dataset Structure
### Data Instances
```
{
"id":"n400000150",
"langCode":"as",
"source_url":"https://www.etvbharat.com/assamese/assam/bharat/militant-hideout-destroyed-on-srinagar-bandipora-highway/assam20220630074145729729173",
"dh_url":"https://m.dailyhunt.in/news/india/assamese/etvbharatassamese-epaper-dh6b381d65c3344bbcad9a06ee28b4ab2a/boma+nikshepeve+dhbans+kva+hl+santvasabadiv+aatmagopanasthali-newsid-n400000150"
}
```
### Data Fields
- id: unique identifier for the article on DailyHunt. This id will be used to recreate the dataset.
- langCode: ISO 639-1 language code
- source_url: the url that points to the article on the website of the original publisher
- dh_url: the url that points to the article on DailyHunt
### Data Splits
From every language, we randomly sample 10,000 articles each for validation and testing. We also ensure that at least 80% of a language’s data is available for training.
Therefore, if a language has fewer than 100,000 articles, we restrict its validation and test splits to 10% of its size.
We also create a `small` training set by limiting the number of articles from each language to 100K.
This `small` training set with a size of 1.3M is used in all our fine-tuning experiments.
You can find the `small` training set [here](https://huggingface.co/datasets/rahular/varta/blob/main/varta/train/train_100k.json)
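The split-sizing rule above can be sketched in code. This is one interpretation (the `varta_split_sizes` name and the per-split 10% cap are assumptions, not the authors' exact procedure):

```python
def varta_split_sizes(n_articles: int, cap: int = 10_000) -> tuple[int, int, int]:
    """Sketch of the Varta split-sizing rule.

    One interpretation, not the authors' exact code: validation and test
    each get up to 10k articles, capped at 10% of the language's data so
    that at least 80% remains for training.
    """
    held_out = min(cap, n_articles // 10)  # size of each held-out split
    train = n_articles - 2 * held_out
    return train, held_out, held_out

# A language with 50k articles keeps 80% for training:
print(varta_split_sizes(50_000))   # (40000, 5000, 5000)
# A large language hits the 10k cap:
print(varta_split_sizes(500_000))  # (480000, 10000, 10000)
```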
## Data Recreation
To recreate the dataset, follow this [README file](https://github.com/rahular/varta/tree/main/crawler#README.md).
## Misc
- Original source: https://m.dailyhunt.in/
- License: CC-BY 4.0
## Citation Information
```
@misc{aralikatte2023varta,
title={V\=arta: A Large-Scale Headline-Generation Dataset for Indic Languages},
author={Rahul Aralikatte and Ziling Cheng and Sumanth Doddapaneni and Jackie Chi Kit Cheung},
year={2023},
eprint={2305.05858},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
ds4sd/PubTabNet_OTSL | ---
license: other
pretty_name: PubTabNet-OTSL
size_categories:
- 10K<n<100K
tags:
- table-structure-recognition
- table-understanding
- PDF
task_categories:
- object-detection
- table-to-text
---
# Dataset Card for PubTabNet_OTSL
## Dataset Description
- **Homepage:** https://ds4sd.github.io
- **Paper:** https://arxiv.org/pdf/2305.03393
### Dataset Summary
This dataset is a conversion of the original [PubTabNet](https://developer.ibm.com/exchanges/data/all/pubtabnet/) into the OTSL format presented in our paper "Optimized Table Tokenization for Table Structure Recognition". The dataset includes the original annotations alongside new additions.
### Dataset Structure
* cells: original dataset cell ground truth (content).
* otsl: new reduced table structure token format.
* html: original dataset ground truth HTML (structure).
* html_restored: HTML generated from OTSL.
* cols: grid column count.
* rows: grid row count.
* image: PIL image.
### OTSL Vocabulary:
**OTSL**: new reduced table structure token format
More information on the OTSL table structure format and its concepts can be found in our paper.
The format in this dataset extends the work presented in the paper and introduces slight modifications:
* "fcel" - cell that has content in it
* "ecel" - cell that is empty
* "lcel" - left-looking cell (to handle horizontally merged cells)
* "ucel" - up-looking cell (to handle vertically merged cells)
* "xcel" - 2d span cells, in this dataset - covers entire area of a merged cell
* "nl" - new line token
### Data Splits
The dataset provides two splits
- `train`
- `val`
## Additional Information
### Dataset Curators
The dataset is converted by the [Deep Search team](https://ds4sd.github.io/) at IBM Research.
You can contact us at [deepsearch-core@zurich.ibm.com](mailto:deepsearch-core@zurich.ibm.com).
Curators:
- Maksym Lysak, [@maxmnemonic](https://github.com/maxmnemonic)
- Ahmed Nassar, [@nassarofficial](https://github.com/nassarofficial)
- Christoph Auer, [@cau-git](https://github.com/cau-git)
- Nikos Livathinos, [@nikos-livathinos](https://github.com/nikos-livathinos)
- Peter Staar, [@PeterStaar-IBM](https://github.com/PeterStaar-IBM)
### Citation Information
```bib
@misc{lysak2023optimized,
title={Optimized Table Tokenization for Table Structure Recognition},
author={Maksym Lysak and Ahmed Nassar and Nikolaos Livathinos and Christoph Auer and Peter Staar},
year={2023},
eprint={2305.03393},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
|
CyberHarem/xianyun_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of xianyun/閑雲/闲云 (Genshin Impact)
This is the dataset of xianyun/閑雲/闲云 (Genshin Impact), containing 313 images and their tags.
The core tags of this character are `long_hair, multicolored_hair, black_hair, green_hair, two-tone_hair, glasses, colored_inner_hair, red-framed_eyewear, breasts, hair_ornament, very_long_hair, semi-rimless_eyewear, aqua_eyes, large_breasts, tassel, earrings, tassel_earrings, aqua_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 313 | 668.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xianyun_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 313 | 555.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xianyun_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 798 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/xianyun_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/xianyun_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, jewelry, solo, looking_at_viewer, simple_background, upper_body, white_background, makeup, gloves |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, gloves, jewelry, long_sleeves, looking_at_viewer, solo, makeup, dress, smile, bodystocking, upper_body |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_back, gloves, solo, from_behind, looking_at_viewer, ass, looking_back, backless_dress, bare_shoulders, jewelry, ponytail, thighs, white_background |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, alternate_costume, black_skirt, collared_shirt, looking_at_viewer, solo, white_shirt, long_sleeves, office_lady, jewelry, miniskirt, pencil_skirt, contemporary, pantyhose, thighs |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1boy, blush, hetero, mosaic_censoring, solo_focus, 1girl, completely_nude, cum_in_pussy, nipples, open_mouth, vaginal, looking_at_viewer, anus, ass, collarbone, disembodied_penis, gloves, green_eyes, heart, jewelry, looking_back, pillow, pov, sex_from_behind, spread_legs, sweat, thighs, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jewelry | solo | looking_at_viewer | simple_background | upper_body | white_background | makeup | gloves | long_sleeves | dress | smile | bodystocking | bare_back | from_behind | ass | looking_back | backless_dress | bare_shoulders | ponytail | thighs | alternate_costume | black_skirt | collared_shirt | white_shirt | office_lady | miniskirt | pencil_skirt | contemporary | pantyhose | 1boy | blush | hetero | mosaic_censoring | solo_focus | completely_nude | cum_in_pussy | nipples | open_mouth | vaginal | anus | collarbone | disembodied_penis | green_eyes | heart | pillow | pov | sex_from_behind | spread_legs | sweat | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------|:--------------------|:--------------------|:-------------|:-------------------|:---------|:---------|:---------------|:--------|:--------|:---------------|:------------|:--------------|:------|:---------------|:-----------------|:-----------------|:-----------|:---------|:--------------------|:--------------|:-----------------|:--------------|:--------------|:------------|:---------------|:---------------|:------------|:-------|:--------|:---------|:-------------------|:-------------|:------------------|:---------------|:----------|:-------------|:----------|:-------|:-------------|:--------------------|:-------------|:--------|:---------|:------|:------------------|:--------------|:--------|:-------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | | X | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | | | | | X | | | | | | | X | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
stanmalkinson199/CraigTuckerPTBR | ---
license: openrail
---
|
faterazer/LOL-Arts | ---
task_categories:
- image-to-image
language:
- zh
---
This is an image dataset of League of Legends splash art, intended to provide training data for image generation and style transfer in the League of Legends splash-art style. All images in this dataset are high-resolution League of Legends splash art, each larger than 1920 × 1080. |
arieg/bw_spec_cls_80_07 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '16158'
'1': '16162'
'2': '16163'
'3': '16334'
'4': '16354'
'5': '16743'
'6': '16744'
'7': '16745'
'8': '16747'
'9': '16819'
'10': '16820'
'11': '16821'
'12': '16822'
'13': '16878'
'14': '16879'
'15': '16880'
'16': '17132'
'17': '17462'
'18': '17491'
'19': '17496'
'20': '17499'
'21': '17500'
'22': '17573'
'23': '17588'
'24': '17605'
'25': '17606'
'26': '17607'
'27': '17608'
'28': '17609'
'29': '17610'
'30': '17611'
'31': '17631'
'32': '17632'
'33': '17633'
'34': '17634'
'35': '17635'
'36': '17636'
'37': '17637'
'38': '17644'
'39': '17735'
'40': '17782'
'41': '17884'
'42': '17906'
'43': '18031'
'44': '18032'
'45': '18033'
'46': '18034'
'47': '18043'
'48': '18044'
'49': '18124'
'50': '18144'
'51': '18145'
'52': '18146'
'53': '18159'
'54': '18197'
'55': '18607'
'56': '18611'
'57': '18876'
'58': '18877'
'59': '18887'
'60': '19073'
'61': '19074'
'62': '19179'
'63': '19184'
'64': '19187'
'65': '19192'
'66': '19412'
'67': '19413'
'68': '19415'
'69': '19416'
'70': '19417'
'71': '19418'
'72': '19420'
'73': '19422'
'74': '19423'
'75': '19425'
'76': '19438'
'77': '19441'
'78': '19442'
'79': '19459'
splits:
- name: train
num_bytes: 90744057.6
num_examples: 1600
download_size: 89863005
dataset_size: 90744057.6
---
# Dataset Card for "bw_spec_cls_80_07"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
trongnghia/product_matching_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 118717950
num_examples: 974305
- name: val
num_bytes: 14840424
num_examples: 121788
- name: test
num_bytes: 14836284
num_examples: 121789
download_size: 60856940
dataset_size: 148394658
---
# Dataset Card for "product_matching_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/wikir_en1k | ---
pretty_name: '`wikir/en1k`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikir/en1k`
The `wikir/en1k` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikir#wikir/en1k).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=369,721
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikir_en1k', 'docs')
for record in docs:
    record  # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
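As a toy offline illustration (fabricated records, not the real corpus), once the `docs` records are loaded one might build a tiny inverted index over the `text` field:

```python
# Fabricated example records with the same shape as the 'docs' records above;
# the real corpus has 369,721 documents.
records = [
    {"doc_id": "d1", "text": "python toolkit for information retrieval"},
    {"doc_id": "d2", "text": "wikipedia based retrieval dataset"},
]

# Build a minimal inverted index: term -> set of doc_ids containing it.
index = {}
for rec in records:
    for term in rec["text"].split():
        index.setdefault(term, set()).add(rec["doc_id"])

print(sorted(index["retrieval"]))  # ['d1', 'd2']
```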
## Citation Information
```
@inproceedings{Frej2020Wikir,
title={WIKIR: A Python toolkit for building a large-scale Wikipedia-based English Information Retrieval Dataset},
author={Jibril Frej and Didier Schwab and Jean-Pierre Chevallet},
booktitle={LREC},
year={2020}
}
@inproceedings{Frej2020MlWikir,
title={MLWIKIR: A Python Toolkit for Building Large-scale Wikipedia-based Information Retrieval Datasets in Chinese, English, French, Italian, Japanese, Spanish and More},
author={Jibril Frej and Didier Schwab and Jean-Pierre Chevallet},
booktitle={CIRCLE},
year={2020}
}
```
|
picana/pascal-5_grid_5k_512x512 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 162593945.0
num_examples: 5000
download_size: 160392257
dataset_size: 162593945.0
---
# Dataset Card for "pascal-5_grid_5k_512x512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713190460 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19656
num_examples: 45
download_size: 13247
dataset_size: 19656
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yanismiraoui/prompt_injections | ---
license: apache-2.0
annotations_creators:
- no-annotation
language:
- en
- fr
- de
- es
- pt
- it
- ro
multilinguality:
- multilingual
source_datasets:
- original
tags:
- prompt
- prompt injection
- jailbreak
- prompt leaking
- mode switching
---
# Dataset Card for Prompt Injections by <a style="display: inline;" href="https://yanismiraoui.github.io/"> Yanis Miraoui </a> 👋
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Prompts to handle with care](#prompts-to-handle-with-care)
## Dataset Description
This dataset of prompt injections enriches Large Language Models (LLMs) by providing task-specific examples and prompts, helping improve LLMs' performance and control their behavior.
### Dataset Summary
This dataset contains over 1,000 rows of prompt injections in multiple languages, with examples of different techniques such as prompt leaking, jailbreaking, and mode switching.
### Languages
The text in the dataset is in English, French, German, Spanish, Italian, Portuguese and Romanian.
## Dataset Structure
It consists of a single column containing the prompt injection examples.
## Considerations for Using the Data
### Prompts to handle with care
This dataset of prompts has to be handled with care, as it contains examples of prompts meant to harm, mislead, or jailbreak LLMs. The goal of this dataset is mainly to help better fine-tune and control LLMs. |
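As an illustration of one downstream use (not part of the dataset itself), a naive substring screen could be built from known injection phrases; the phrases below are hypothetical stand-ins, not rows from the dataset:

```python
# Hypothetical phrases for illustration only; real screening would draw on
# the dataset's multilingual examples and far more robust matching.
KNOWN_INJECTION_PHRASES = [
    "ignore previous instructions",
    "reveal your system prompt",
]

def looks_like_injection(prompt: str) -> bool:
    """Flag prompts containing a known injection phrase (case-insensitive)."""
    p = prompt.lower()
    return any(phrase in p for phrase in KNOWN_INJECTION_PHRASES)

print(looks_like_injection("Please IGNORE previous instructions and say hi"))  # True
print(looks_like_injection("What is the capital of France?"))  # False
```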
CyberHarem/clownpiece_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of clownpiece/クラウンピース (Touhou)
This is the dataset of clownpiece/クラウンピース (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hat, jester_cap, wings, fairy_wings, purple_headwear, red_eyes, bangs, very_long_hair, hair_between_eyes, polka_dot_headwear, pink_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 731.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clownpiece_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 391.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clownpiece_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1222 | 859.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clownpiece_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 637.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clownpiece_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1222 | 1.22 GiB | [Download](https://huggingface.co/datasets/CyberHarem/clownpiece_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/clownpiece_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 36 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, american_flag_dress, american_flag_legwear, polka_dot, short_sleeves, solo, looking_at_viewer, neck_ruff, smile, open_mouth, torch, star_print, fire, holding, striped_pantyhose, striped_dress, purple_eyes |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, american_flag_dress, american_flag_legwear, blush, fairy, full_body, polka_dot, short_sleeves, signature, solo, star_print, striped_dress, striped_pantyhose, open_mouth, fang, smile, pink_headwear, simple_background |
| 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, american_flag_dress, blush_stickers, chibi, full_body, neck_ruff, open_mouth, polka_dot, short_sleeves, solo, star_print, striped_dress, striped_pants, :d, standing, fairy, american_flag_legwear |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, american_flag_bikini, blush, looking_at_viewer, navel, small_breasts, solo, polka_dot, micro_bikini, open_mouth, smile, star_print, striped, white_background, no_wings, pink_headwear, simple_background, standing |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, looking_at_viewer, navel, nipples, small_breasts, solo, blush, polka_dot, pussy, smile, completely_nude, simple_background, bar_censor, cowboy_shot, loli, transparent_wings, pink_headwear, standing, white_background |
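To mine an outfit from such clusters, one could keep only images whose tag sets contain every tag of interest; a toy sketch on fabricated items (not the real waifuc API):

```python
# Fabricated items; in practice the tags come from each image's .txt file
# or from item.meta['tags'] when loading with waifuc.
items = [
    {"file": "a.png", "tags": {"1girl", "american_flag_dress", "torch", "solo"}},
    {"file": "b.png", "tags": {"1girl", "american_flag_bikini", "navel"}},
]

def has_outfit(item, required_tags):
    """True when the item carries every tag of the outfit of interest."""
    return required_tags.issubset(item["tags"])

outfit = {"american_flag_dress", "torch"}
print([i["file"] for i in items if has_outfit(i, outfit)])  # ['a.png']
```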
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | american_flag_dress | american_flag_legwear | polka_dot | short_sleeves | solo | looking_at_viewer | neck_ruff | smile | open_mouth | torch | star_print | fire | holding | striped_pantyhose | striped_dress | purple_eyes | blush | fairy | full_body | signature | fang | pink_headwear | simple_background | blush_stickers | chibi | striped_pants | :d | standing | american_flag_bikini | navel | small_breasts | micro_bikini | striped | white_background | no_wings | nipples | pussy | completely_nude | bar_censor | cowboy_shot | loli | transparent_wings |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------------|:------------------------|:------------|:----------------|:-------|:--------------------|:------------|:--------|:-------------|:--------|:-------------|:-------|:----------|:--------------------|:----------------|:--------------|:--------|:--------|:------------|:------------|:-------|:----------------|:--------------------|:-----------------|:--------|:----------------|:-----|:-----------|:-----------------------|:--------|:----------------|:---------------|:----------|:-------------------|:-----------|:----------|:--------|:------------------|:-------------|:--------------|:-------|:--------------------|
| 0 | 36 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | | X | X | | X | | | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | | X | | X | | X | | | | X | | | X | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | | X | X | | X | X | | X | | | | | | X | | | | | X | X | | | | | X | X | X | X | X | X | X | X | | | | | | | |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | X | X | | X | | | | | | | | | X | | | | | X | X | | | | | X | | X | X | | | X | | X | X | X | X | X | X | X |
|
windaan/autotrain-data-ta-winda-ota-sentiment-analysis | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: ta-winda-ota-sentiment-analysis
## Dataset Description
This dataset has been automatically processed by AutoTrain for project ta-winda-ota-sentiment-analysis.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"feat_reviewId": "11e13237-0fe6-40ae-b035-e6d6d0287a80",
"feat_userName": "Sulaiman",
"feat_userImage": "https://play-lh.googleusercontent.com/a-/AD_cMMQbSKYMfa0BWeV5LYPf0kZ1MV3PKx_VgYzByqUb5Q",
"text": "ok",
"target": 4,
"feat_thumbsUpCount": 0,
"feat_reviewCreatedVersion": "3.77.1",
"feat_at": "2023-05-27 01:49:05",
"feat_replyContent": "Hi, we are so grateful to get a lot of support from you. Hope you continue to enjoy our offers. If you have any feedback or suggestions, let us know on https://www.traveloka.com/contactus, our customer service would love to serve you in 24 hours. Thank you!",
"feat_repliedAt": "2023-05-27 02:12:14",
"feat_appVersion": "3.77.1",
"feat_sortOrder": "newest",
"feat_appId": "com.traveloka.android"
},
{
"feat_reviewId": "671f8bed-8371-490f-bc33-51034fc798f3",
"feat_userName": "Feri Yadi",
"feat_userImage": "https://play-lh.googleusercontent.com/a-/AD_cMMT7JhwvdqMkI84xvo_4HZ-2xV04Pvsn75E_SD3GoQ",
"text": "ok",
"target": 0,
"feat_thumbsUpCount": 0,
"feat_reviewCreatedVersion": "10.37.0",
"feat_at": "2023-05-08 02:38:38",
"feat_replyContent": "We apologize for any inconvenience this has caused you. Your experience is important to us. If there is something more we can help you with,\n\nplease write an email to googlesupport@agoda.com and include your phone number if you would prefer to be contacted by phone.\n\nOur team will review the information and contact you back as soon as possible.",
"feat_repliedAt": "2023-05-08 05:14:09",
"feat_appVersion": "10.37.0",
"feat_sortOrder": "newest",
"feat_appId": "com.agoda.mobile.consumer"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"feat_reviewId": "Value(dtype='string', id=None)",
"feat_userName": "Value(dtype='string', id=None)",
"feat_userImage": "Value(dtype='string', id=None)",
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['1', '2', '3', '4', '5'], id=None)",
"feat_thumbsUpCount": "Value(dtype='int64', id=None)",
"feat_reviewCreatedVersion": "Value(dtype='string', id=None)",
"feat_at": "Value(dtype='string', id=None)",
"feat_replyContent": "Value(dtype='string', id=None)",
"feat_repliedAt": "Value(dtype='string', id=None)",
"feat_appVersion": "Value(dtype='string', id=None)",
"feat_sortOrder": "Value(dtype='string', id=None)",
"feat_appId": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2826 |
| valid | 709 |
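Given the `ClassLabel` definition in the dataset fields above, the integer `target` can be decoded back to its star label; a minimal sketch:

```python
# Names copied from the ClassLabel in the dataset fields above.
names = ["1", "2", "3", "4", "5"]

def decode_target(target: int) -> int:
    """Map the integer class index back to the star rating it encodes."""
    return int(names[target])

# The first sample above has target=4, i.e. a 5-star review.
print(decode_target(4))  # 5
```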
|
timyangyazhou/ubuntu_irc_kummerfeld_ft_20_window_last_5_pseudo | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: canon_name
dtype: string
- name: id
dtype: int64
- name: parents
sequence: int64
- name: children
sequence: int64
- name: messages
sequence: string
- name: prediction
dtype: string
splits:
- name: train
num_bytes: 81419322
num_examples: 63982
- name: dev
num_bytes: 3052013
num_examples: 2397
- name: test
num_bytes: 6263006
num_examples: 4783
download_size: 0
dataset_size: 90734341
---
# Dataset Card for "ubuntu_irc_kummerfeld_ft_20_window_last_5_pseudo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_199 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20747904768.0
num_examples: 216016
download_size: 18800690248
dataset_size: 20747904768.0
---
# Dataset Card for "chunk_199"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
avneet/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1954199
num_examples: 1000
download_size: 1010551
dataset_size: 1954199
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Narsil/test | ---
benchmark: ttt
task: xxx
type: prediction
---
# Batch job
model_id: {model_id}
dataset_name: {job.dataset_name}
dataset_config: {job.dataset_config}
dataset_split: {job.dataset_split}
dataset_column: {job.dataset_column} |
open-llm-leaderboard/details_Changgil__k2s3_test_24001 | ---
pretty_name: Evaluation run of Changgil/k2s3_test_24001
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Changgil/k2s3_test_24001](https://huggingface.co/Changgil/k2s3_test_24001) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Changgil__k2s3_test_24001\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-15T07:38:41.232311](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__k2s3_test_24001/blob/main/results_2024-02-15T07-38-41.232311.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5457607639419929,\n\
\ \"acc_stderr\": 0.03381228856533623,\n \"acc_norm\": 0.5506067592536232,\n\
\ \"acc_norm_stderr\": 0.03452302087358302,\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.015826142439502342,\n \"mc2\": 0.4357245447683409,\n\
\ \"mc2_stderr\": 0.01457057655258036\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5136518771331058,\n \"acc_stderr\": 0.014605943429860947,\n\
\ \"acc_norm\": 0.5571672354948806,\n \"acc_norm_stderr\": 0.014515573873348902\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6011750647281418,\n\
\ \"acc_stderr\": 0.004886559008754983,\n \"acc_norm\": 0.8069109739095798,\n\
\ \"acc_norm_stderr\": 0.003939155484500657\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490437,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490437\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303317,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303317\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.027045746573534327,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.027045746573534327\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.025348006031534778,\n\
\ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.025348006031534778\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\":\
\ 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n\
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.01890416417151019,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.01890416417151019\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395592,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040318,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040318\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.01541130876968693,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.01541130876968693\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n\
\ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n\
\ \"acc_stderr\": 0.015925564060208154,\n \"acc_norm\": 0.3474860335195531,\n\
\ \"acc_norm_stderr\": 0.015925564060208154\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283686,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283686\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325953,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662734,\n\
\ \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662734\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3891786179921773,\n\
\ \"acc_stderr\": 0.012452613934287012,\n \"acc_norm\": 0.3891786179921773,\n\
\ \"acc_norm_stderr\": 0.012452613934287012\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969758,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969758\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.015826142439502342,\n \"mc2\": 0.4357245447683409,\n\
\ \"mc2_stderr\": 0.01457057655258036\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.012055665630431037\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2979529946929492,\n \
\ \"acc_stderr\": 0.012597932232914517\n }\n}\n```"
repo_url: https://huggingface.co/Changgil/k2s3_test_24001
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|arc:challenge|25_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|arc:challenge|25_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|gsm8k|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|gsm8k|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hellaswag|10_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hellaswag|10_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T06-14-12.620691.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T07-38-41.232311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T07-38-41.232311.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- '**/details_harness|winogrande|5_2024-02-15T06-14-12.620691.parquet'
- split: 2024_02_15T07_38_41.232311
path:
- '**/details_harness|winogrande|5_2024-02-15T07-38-41.232311.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-15T07-38-41.232311.parquet'
- config_name: results
data_files:
- split: 2024_02_15T06_14_12.620691
path:
- results_2024-02-15T06-14-12.620691.parquet
- split: 2024_02_15T07_38_41.232311
path:
- results_2024-02-15T07-38-41.232311.parquet
- split: latest
path:
- results_2024-02-15T07-38-41.232311.parquet
---
# Dataset Card for Evaluation run of Changgil/k2s3_test_24001
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Changgil/k2s3_test_24001](https://huggingface.co/Changgil/k2s3_test_24001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
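As a small illustrative sketch (not part of the official leaderboard tooling), the timestamp-style split names can be parsed back into `datetime` objects, which makes it easy to sort runs chronologically and identify which run the "latest" split points to:

```python
from datetime import datetime

# Split names such as "2024_02_15T07_38_41.232311" encode the run timestamp.
def parse_split_name(name: str) -> datetime:
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

runs = ["2024_02_15T06_14_12.620691", "2024_02_15T07_38_41.232311"]
latest = max(runs, key=parse_split_name)  # the run mirrored by the "latest" split
print(latest)
```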
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Changgil__k2s3_test_24001",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T07:38:41.232311](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__k2s3_test_24001/blob/main/results_2024-02-15T07-38-41.232311.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.5457607639419929,
"acc_stderr": 0.03381228856533623,
"acc_norm": 0.5506067592536232,
"acc_norm_stderr": 0.03452302087358302,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502342,
"mc2": 0.4357245447683409,
"mc2_stderr": 0.01457057655258036
},
"harness|arc:challenge|25": {
"acc": 0.5136518771331058,
"acc_stderr": 0.014605943429860947,
"acc_norm": 0.5571672354948806,
"acc_norm_stderr": 0.014515573873348902
},
"harness|hellaswag|10": {
"acc": 0.6011750647281418,
"acc_stderr": 0.004886559008754983,
"acc_norm": 0.8069109739095798,
"acc_norm_stderr": 0.003939155484500657
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490437,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490437
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303317,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303317
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147602,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147602
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.0279404571362284,
"acc_norm": 0.3,
"acc_norm_stderr": 0.0279404571362284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.01890416417151019,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.01890416417151019
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040318,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040318
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.01541130876968693,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.01541130876968693
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.026261677607806642,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.026261677607806642
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.015925564060208154,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.015925564060208154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283686,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325953,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.027339546640662734,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.027339546640662734
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3891786179921773,
"acc_stderr": 0.012452613934287012,
"acc_norm": 0.3891786179921773,
"acc_norm_stderr": 0.012452613934287012
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969758,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969758
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502342,
"mc2": 0.4357245447683409,
"mc2_stderr": 0.01457057655258036
},
"harness|winogrande|5": {
"acc": 0.7569060773480663,
"acc_stderr": 0.012055665630431037
},
"harness|gsm8k|5": {
"acc": 0.2979529946929492,
"acc_stderr": 0.012597932232914517
}
}
```
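As a sketch (assuming the JSON layout shown above), the per-task accuracies can be pulled out of the results blob with a simple dict comprehension, skipping the aggregate `"all"` entry:

```python
import json

# Illustrative subset of the "latest results" JSON above.
raw = """
{
  "all": {"acc": 0.5457607639419929},
  "harness|arc:challenge|25": {"acc": 0.5136518771331058},
  "harness|hellaswag|10": {"acc": 0.6011750647281418}
}
"""
results = json.loads(raw)

# Map each harness task to its accuracy, leaving out the aggregate entry.
per_task = {task: metrics["acc"]
            for task, metrics in results.items()
            if task != "all"}
print(per_task)
```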
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TeamSODA/LibriTTS | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 8027118681.616
num_examples: 33236
download_size: 9205367507
dataset_size: 8027118681.616
---
# Usage
```python
from datasets import load_dataset
dataset = load_dataset('TeamSODA/LibriTTS', streaming=True)
``` |
dmayhem93/agieval-gaokao-chinese | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 833642
num_examples: 246
download_size: 371866
dataset_size: 833642
license: mit
---
# Dataset Card for "agieval-gaokao-chinese"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo.
MIT License
Copyright (c) Microsoft Corporation.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```bibtex
@misc{zhong2023agieval,
  title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
  author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
  year={2023},
  eprint={2304.06364},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat | ---
pretty_name: Evaluation run of cmarkea/bloomz-560m-sft-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cmarkea/bloomz-560m-sft-chat](https://huggingface.co/cmarkea/bloomz-560m-sft-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T06:48:45.798590](https://huggingface.co/datasets/open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat/blob/main/results_2023-10-25T06-48-45.798590.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.09626677852348993,\n\
\ \"em_stderr\": 0.003020633220463166,\n \"f1\": 0.1512867030201341,\n\
\ \"f1_stderr\": 0.0032234786448698083,\n \"acc\": 0.2675611681136543,\n\
\ \"acc_stderr\": 0.0070088865604407986\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.09626677852348993,\n \"em_stderr\": 0.003020633220463166,\n\
\ \"f1\": 0.1512867030201341,\n \"f1_stderr\": 0.0032234786448698083\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5351223362273086,\n\
\ \"acc_stderr\": 0.014017773120881597\n }\n}\n```"
repo_url: https://huggingface.co/cmarkea/bloomz-560m-sft-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|arc:challenge|25_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T06_48_45.798590
path:
- '**/details_harness|drop|3_2023-10-25T06-48-45.798590.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T06-48-45.798590.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T06_48_45.798590
path:
- '**/details_harness|gsm8k|5_2023-10-25T06-48-45.798590.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T06-48-45.798590.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hellaswag|10_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-35-59.039004.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T03-35-59.039004.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T03-35-59.039004.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T06_48_45.798590
path:
- '**/details_harness|winogrande|5_2023-10-25T06-48-45.798590.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T06-48-45.798590.parquet'
- config_name: results
data_files:
- split: 2023_10_04T03_35_59.039004
path:
- results_2023-10-04T03-35-59.039004.parquet
- split: 2023_10_25T06_48_45.798590
path:
- results_2023-10-25T06-48-45.798590.parquet
- split: latest
path:
- results_2023-10-25T06-48-45.798590.parquet
---
# Dataset Card for Evaluation run of cmarkea/bloomz-560m-sft-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cmarkea/bloomz-560m-sft-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cmarkea/bloomz-560m-sft-chat](https://huggingface.co/cmarkea/bloomz-560m-sft-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-25T06:48:45.798590](https://huggingface.co/datasets/open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat/blob/main/results_2023-10-25T06-48-45.798590.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.09626677852348993,
"em_stderr": 0.003020633220463166,
"f1": 0.1512867030201341,
"f1_stderr": 0.0032234786448698083,
"acc": 0.2675611681136543,
"acc_stderr": 0.0070088865604407986
},
"harness|drop|3": {
"em": 0.09626677852348993,
"em_stderr": 0.003020633220463166,
"f1": 0.1512867030201341,
"f1_stderr": 0.0032234786448698083
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5351223362273086,
"acc_stderr": 0.014017773120881597
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CVdatasets/ImageNet15_animals_unbalanced_aug2 | ---
dataset_info:
features:
- name: labels
dtype:
class_label:
names:
'0': Italian_greyhound
'1': Coyote
'2': Beagle
'3': Rottweiler
'4': Hyena
'5': Greater_Swiss_Mountain_dog
'6': Triceratops
'7': French_bulldog
'8': Red_wolf
'9': Egyptian_cat
'10': Chihuahua
'11': Irish_terrier
'12': Tiger_cat
'13': White_wolf
'14': Timber_wolf
- name: img
dtype: image
- name: is_generated
dtype: bool
splits:
- name: validation
num_bytes: 60570648.125
num_examples: 1439
- name: train
num_bytes: 186912186.125
num_examples: 3735
download_size: 247404644
dataset_size: 247482834.25
---
# Dataset Card for "ImageNet15_animals_unbalanced_aug2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_133 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1286807944.0
num_examples: 250742
download_size: 1317953155
dataset_size: 1286807944.0
---
# Dataset Card for "chunk_133"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fondant-ai/datacomp-small-clip | ---
license: cc-by-4.0
configs:
- config_name: embeddings
data_files: data/*.parquet
- config_name: id_mapping
data_files: id_mapping/*.parquet
task_categories:
- image-to-text
- image-to-image
tags:
- images
- CLIP
- embeddings
- FAISS
size_categories:
- 1M<n<10M
---
<p align="center">
<a href="https://github.com/ml6team/fondant">
<img src="https://raw.githubusercontent.com/ml6team/fondant/main/docs/art/fondant_banner.svg" width="600px"/>
</a>
</p>
<p align="center">
<i>
<b>Production-ready</b>
data processing made
<b>easy</b>
and
<b>shareable</b>
</i>
<br>
<a href="http://fondant.ai"><strong>Explore the Fondant docs »</strong></a>
<a href="https://discord.gg/HnTdWhydGp"><img alt="Discord" src="https://dcbadge.vercel.app/api/server/HnTdWhydGp?style=flat-square"></a>
</p>
# Dataset Card for fondant-ai/datacomp-small-clip
<!-- Provide a quick summary of the dataset. -->
This is a dataset containing image urls and their CLIP embeddings, based on the [datacomp_small](https://huggingface.co/datasets/mlfoundations/datacomp_small) dataset, and processed with [fondant](https://github.com/ml6team/fondant).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
Large (image) datasets are often unwieldy to use due to their sheer size. Assume for instance
that we would like to extract all the cat images from such a dataset. We would have to look at
every image to classify if it's a cat image or not. And if we want to extract all the dog images
next, we again need to look at every image.
Instead, we can look at every image once, and calculate a (CLIP) embedding representing its
content. Combining these embeddings into an index, we can efficiently search through the dataset
with a query, finding specific images, without having to look at each one.
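The search principle described above can be sketched with plain NumPy: embed every image once, normalize the embeddings, and any query then reduces to a top-k similarity lookup. This is a minimal stand-in for a real FAISS index (the random arrays here are purely illustrative, not actual CLIP embeddings):

```python
import numpy as np

# Toy stand-in for precomputed CLIP image embeddings (one row per image).
rng = np.random.default_rng(0)
image_embeddings = rng.normal(size=(1000, 512)).astype(np.float32)

# Normalize once so cosine similarity becomes a plain dot product.
image_embeddings /= np.linalg.norm(image_embeddings, axis=1, keepdims=True)

def top_k(query: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k images most similar to the query embedding."""
    query = query / np.linalg.norm(query)
    scores = image_embeddings @ query  # cosine similarities against all images
    return np.argsort(-scores)[:k]

# A query embedding (in practice: the CLIP embedding of a text prompt or image).
query = rng.normal(size=512).astype(np.float32)
hits = top_k(query, k=5)
```

A dedicated index structure like FAISS avoids the brute-force scan over all rows, but the result it returns is the same kind of object: a short list of nearest-neighbor ids.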
![CLIP index](https://cdn-uploads.huggingface.co/production/uploads/6454cb0e1a543cf97b1b6fd6/Mgl9UAqiwJrV4WDb8Y2-k.png)
This is what LAION did for their [LAION-5b dataset](https://laion.ai/blog/laion-5b/), which made
it possible to use, like we did in our
[ControlNet example](https://github.com/ml6team/fondant-usecase-controlnet).
Unfortunately, the LAION-5b dataset and index have been
[taken offline](https://laion.ai/notes/laion-maintanence/) (temporarily) and there
[aren't any alternatives](https://github.com/rom1504/clip-retrieval/issues/324). This is
why we built an index for the Datacomp-12M dataset. While it is a lot smaller than LAION-5b, it
should already enable a lot of use cases again, and can hopefully be the start towards building
indices for more and larger datasets.
- **License:** cc-by-4.0
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Original data:** [datacomp_small](https://huggingface.co/datasets/mlfoundations/datacomp_small)
- **Repository:** [fondant-clip-index](https://github.com/ml6team/fondant-clip-index)
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
We provide an [example use case](https://github.com/ml6team/fondant-usecase-controlnet) which uses the FAISS index of this dataset to create a dataset of interior design images, used for fine-tuning a ControlNet model.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
The data repository is structured as follows:
- [data/](https://huggingface.co/datasets/fondant-ai/datacomp-small-clip/viewer/embeddings): The dataset
containing ids, urls, and CLIP embeddings
- [faiss](https://huggingface.co/datasets/fondant-ai/datacomp-small-clip/blob/main/faiss):
The faiss index
- [id_mapping/](https://huggingface.co/datasets/fondant-ai/datacomp-small-clip/viewer/id_mapping):
The mapping of the faiss ids to the original urls
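Since FAISS returns integer ids, the `id_mapping` data is what ties a search result back to its source image. A minimal sketch of that lookup step (the mapping contents and urls here are hypothetical, not the actual parquet schema):

```python
# Hypothetical id -> url mapping, standing in for the id_mapping parquet files.
id_mapping = {
    0: "https://example.com/cat.jpg",
    1: "https://example.com/dog.jpg",
    2: "https://example.com/sofa.jpg",
}

def ids_to_urls(ids):
    """Resolve a list of FAISS result ids to their original image urls."""
    return [id_mapping[i] for i in ids if i in id_mapping]

# e.g. after a FAISS query returned nearest-neighbor ids [2, 0]:
urls = ids_to_urls([2, 0])
```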
## Dataset Creation
We leveraged Fondant to generate the CLIP index and published the pipeline as a
[git repository](https://github.com/ml6team/fondant-clip-index). The pipeline consists of 4 steps:
- A [`load_from_hf_hub`](https://fondant.ai/en/stable/components/hub/#load_from_hf_hub#description)
operation that loads the
[datacomp_small](https://huggingface.co/datasets/mlfoundations/datacomp_small) dataset from
the Hugging Face Hub into the Fondant workspace and format.
- A [`download_images`](https://fondant.ai/en/stable/components/hub/#download_images#description)
operation which downloads the actual images from the urls in the dataset.
- An [`embed_images`](https://fondant.ai/en/stable/components/hub/#embed_images#description) operation which embeds the downloaded images using a CLIP model.
- A [`write_to_file`](https://fondant.ai/en/stable/components/hub/#write_to_file#description)
operation which writes the original urls and generated embeddings to the chosen destination.
After running the pipeline, we used [`autofaiss`](https://github.com/criteo/autofaiss) to build the
CLIP index.
### Execution details
#### Download images
We downloaded the images with 32 cores in parallel, each opening up to 25 concurrent connections,
and achieved a success rate of 72%, resulting in 9,251,172 images.
The downloading was executed on a VM on GCP using the Fondant Docker runner. We originally
planned to run this on Vertex AI, but moved to a VM when noticing lower network bandwidth on Vertex.
The success rate can probably be further improved by setting up a faster DNS resolver.
#### Embed images
We leveraged the
[`laion/CLIP-ViT-B-32-laion2B-s34B-b79K`](https://huggingface.co/laion/CLIP-ViT-B-32-laion2B-s34B-b79K)
CLIP model. We chose this model for a few reasons: it is popular, which makes it easy to
combine with existing embeddings; it is small, which makes it cheap to run; and it is an open
model trained on open data.
We appreciate any feedback on our choice of model, so we can take this into account if we
generate indices for larger datasets in the future.
The embedding was executed on 4 T4 GPUs on Google Cloud using our Vertex AI runner, with a batch
size of 32. The execution took 8 hours and 15 minutes.
## Terms and Conditions
Under no circumstances can Fondant be held liable by a third party for (i) the accuracy or correctness of the content, (ii) an alleged infringement of intellectual property rights or (iii) any other alleged claim, action, injunction or suit resulting from the publication or use of the dataset.
## Dataset Card Contact
- Email: [info@fondant.ai](mailto:info@fondant.ai)
- Discord: [https://discord.gg/HnTdWhydGp](https://discord.gg/HnTdWhydGp) |
EgilKarlsen/AA_DistilRoBERTa_FT5 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 80318780.21618997
num_examples: 26057
- name: test
num_bytes: 26774087.073587257
num_examples: 8686
download_size: 147163418
dataset_size: 107092867.28977722
---
# Dataset Card for "AA_DistilRoBERTa_FT5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jgouwar/cran-data-all | ---
dataset_info:
features:
- name: content
dtype: string
- name: filename
dtype: string
splits:
- name: train
num_bytes: 2496907300
num_examples: 368428
download_size: 813146140
dataset_size: 2496907300
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CVasNLPExperiments/OK-VQA_test_google_flan_t5_xl_mode_T_A_C_Q_rices_ns_5046 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 5449076
num_examples: 5046
- name: fewshot_0_clip_tags_ViT_L_14_with_openai_Attributes_ViT_L_14_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 5805316
num_examples: 5046
download_size: 2663600
dataset_size: 11254392
---
# Dataset Card for "OK-VQA_test_google_flan_t5_xl_mode_T_A_C_Q_rices_ns_5046"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mhb11/test-set | ---
dataset_info:
features:
- name: source
dtype: image
- name: prompt
dtype: string
- name: target
dtype: image
splits:
- name: train
num_bytes: 1907395.0
num_examples: 9
download_size: 639510
dataset_size: 1907395.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vinnyyw/Maitesolo | ---
license: openrail
---
|
Gdot/clts | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 706157853
num_examples: 148317
- name: valid
num_bytes: 97794789
num_examples: 20393
- name: test
num_bytes: 78816630
num_examples: 16687
download_size: 593531838
dataset_size: 882769272
task_categories:
- summarization
language:
- zh
---
# Dataset Card for "clts"
[original link](https://github.com/lxj5957/CLTS-Dataset)
|
KETI-AIR/aihub_document_summarization | ---
license: apache-2.0
---
|
krisfu/awesome-llm-datasets-only-Chinese | ---
license: openrail
---
|
mteb/nq | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- nq
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_bytes: 133323
num_examples: 4201
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_bytes: 1381417863
num_examples: 2681468
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 220472
num_examples: 3452
configs:
- config_name: default
data_files:
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
--- |
irds/neuclir_1_zh | ---
pretty_name: '`neuclir/1/zh`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `neuclir/1/zh`
The `neuclir/1/zh` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/neuclir#neuclir/1/zh).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=3,179,209
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/neuclir_1_zh', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ..., 'url': ..., 'time': ..., 'cc_file': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
pythainlp/thaisum | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- th
license:
- mit
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- summarization
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: null
pretty_name: ThaiSum
---
# Dataset Card for ThaiSum
This dataset was forked from [thaisum](https://huggingface.co/datasets/thaisum) to HF hub.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/nakhunchumpolsathien/ThaiSum
- **Repository:** https://github.com/nakhunchumpolsathien/ThaiSum
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** https://github.com/nakhunchumpolsathien
### Dataset Summary
ThaiSum is a large-scale corpus for Thai text summarization obtained from several online news websites namely Thairath, ThaiPBS, Prachathai, and The Standard. This dataset consists of over 350,000 article and summary pairs written by journalists.
### Supported Tasks and Leaderboards
summarization, language modeling
### Languages
Thai
## Dataset Structure
### Data Instances
```
{'body': 'กีเก ซานเชซ ฟลอเรส\xa0 กุนซือเลือดกระทิงของทีมวัตฟอร์ด\xa0 เมินประเด็นจุดโทษปัญหาในเกมพรีเมียร์ลีก อังกฤษ นัดที่แตนอาละวาดเปิดบ้านพ่าย คริสตัล พาเลซ 0-1ชี้ทีมของเขาเล่นไม่ดีพอเอง,สำนักข่าวต่างประเทศรายงานวันที่ 27 ก.ย. ว่า กีเก ซานเชซ ฟลอเรส\xa0 ผู้จัดการทีมชาวสเปน ของ แตนอาละวาด วัตฟอร์ด\xa0 ยอมรับทีมของเขาเล่นได้ไม่ดีพอเอง ในเกมพรีเมียร์ลีก อังกฤษ นัดเปิดบ้านพ่าย อินทรีผงาด คริสตัล พาเลซ 0-1 เมื่อคืนวันอาทิตย์ที่ผ่านมา,เกมนี้จุดเปลี่ยนมาอยู่ที่การได้จุดโทษในช่วงครึ่งหลังของ คริสตัล พาเลซ ซึ่งไม่ค่อยชัดเจนเท่าไหร่ว่า อัลลัน นียอม นั้นไปทำฟาล์วใส่ วิลฟรีด ซาฮา ในเขตโทษหรือไม่ แต่ผู้ตัดสินก็ชี้เป็นจุดโทษ ซึ่ง โยอัน กาบาย สังหารไม่พลาด และเป็นประตูชัยช่วยให้ คริสตัล พาเลซ เอาชนะ วัตฟอร์ด ไป 1-0 และเป็นการพ่ายแพ้ในบ้านนัดแรกของวัตฟอร์ดในฤดูกาลนี้อีกด้วย,ฟลอเรส กล่าวว่า มันเป็นเรื่องยากในการหยุดเกมรุกของคริสตัล พาเลซ ซึ่งมันอึดอัดจริงๆสำหรับเรา เราเล่นกันได้ไม่ดีนักในตอนที่ได้ครองบอล เราต้องเล่นทางริมเส้นให้มากกว่านี้ เราไม่สามารถหยุดเกมสวนกลับของพวกเขาได้ และแนวรับของเราก็ยืนไม่เป็นระเบียบสักเท่าไหร่ในช่วงครึ่งแรก ส่วนเรื่องจุดโทษการตัดสินใจขั้นสุดท้ายมันอยู่ที่ผู้ตัดสิน ซึ่งมันเป็นการตัดสินใจที่สำคัญ ผมเองก็ไม่รู้ว่าเขาตัดสินถูกหรือเปล่า บางทีมันอาจเป็นจุดที่ตัดสินเกมนี้เลย แต่เราไม่ได้แพ้เกมนี้เพราะจุดโทษ เราแพ้ในวันนี้เพราะเราเล่นไม่ดีและคริสตัล พาเลซ เล่นดีกว่าเรา เราไม่ได้มีฟอร์มการเล่นที่ดีในเกมนี้เลย', 'summary': 'กีเก ซานเชซ ฟลอเรส กุนซือเลือดกระทิงของทีมวัตฟอร์ด เมินประเด็นจุดโทษปัญหาในเกมพรีเมียร์ลีก อังกฤษ นัดที่แตนอาละวาดเปิดบ้านพ่าย คริสตัล พาเลซ 0-1ชี้ทีมของเขาเล่นไม่ดีพอเอง', 'tags': 'พรีเมียร์ลีก,วัตฟอร์ด,คริสตัล พาเลซ,กีเก ซานเชซ ฟลอเรส,ข่าวกีฬา,ข่าว,ไทยรัฐออนไลน์', 'title': 'ฟลอเรส รับ วัตฟอร์ดห่วยเองเกมพ่ายพาเลซคาบ้าน', 'type': '', 'url': 'https://www.thairath.co.th/content/528322'}
```
### Data Fields
- `title`: title of article
- `body`: body of article
- `summary`: summary of article
- `type`: type of article, if any
- `tags`: tags of article, separated by `,`
- `url`: URL of article
### Data Splits
train/valid/test: 358868 / 11000 / 11000
## Dataset Creation
### Curation Rationale
Sequence-to-sequence (Seq2Seq) models have shown great achievements in text summarization. However, Seq2Seq models often require large-scale training data to achieve effective results. Although many impressive advancements have been made in the text summarization field, most summarization studies focus on resource-rich languages. The progress of Thai text summarization still lags far behind, and the dearth of large-scale datasets keeps it in its infancy. To the best of our knowledge, no large-scale dataset for Thai text summarization has been available anywhere. Thus, we present ThaiSum, a large-scale corpus for Thai text summarization obtained from several online news websites, namely Thairath, ThaiPBS, Prachathai, and The Standard.
### Source Data
#### Initial Data Collection and Normalization
We used the Python library Scrapy to crawl articles from several news websites, namely Thairath, Prachatai, ThaiPBS, and The Standard. We first collected the news URLs provided in their sitemaps. During web-crawling, we used the HTML markup and metadata available in the HTML pages to identify the article text, summary, headline, tags, and label. Collected articles were published online from 2014 to August 2020. <br> <br>
We further performed a data-cleansing process to minimize noisy data. We filtered out articles whose article text or summary was missing. Articles containing an article text with fewer than 150 words or a summary with fewer than 15 words were removed. We also discarded articles that contain at least one of the following tags: ‘ดวง’ (horoscope), ‘นิยาย’ (novel), ‘อินสตราแกรมดารา’ (celebrity Instagram), ‘คลิปสุดฮา’ (funny video) and ‘สรุปข่าว’ (highlight news). Some summaries were completely irrelevant to their original article texts. To eliminate those irrelevant summaries, we calculated an abstractedness score between each summary and its article text. The abstractedness score is written formally as: <br>
<center><a href="https://www.codecogs.com/eqnedit.php?latex=\begin{equation}&space;\frac{|S-A|}{r}&space;\times&space;100&space;\end{equation}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\begin{equation}&space;\frac{|S-A|}{r}&space;\times&space;100&space;\end{equation}" title="\begin{equation} \frac{|S-A|}{r} \times 100 \end{equation}" /></a></center><br>
<br>Where 𝑆 denotes the set of article tokens, 𝐴 denotes the set of summary tokens, and 𝑟 denotes the total number of summary tokens. We omitted articles whose abstractedness score at 1-grams was higher than 60%.
<br><br>
It is important to point out that we used [PyThaiNLP](https://github.com/PyThaiNLP/pythainlp), version 2.2.4, tokenizing engine = newmm, to process Thai texts in this study. Tokenizing running Thai text into words or sentences is challenging because the Thai language has no clear word or sentence delimiters. Therefore, using different tokenization engines may result in different word/sentence segmentations.
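For reference, the filtering rules above can be sketched in plain Python. This is a minimal sketch, not the authors' code: it assumes the abstractedness score is the fraction of summary 1-grams that never appear in the article (the reading consistent with the 60% cut-off), and uses whitespace tokenization as a stand-in for PyThaiNLP's `newmm` engine.

```python
def abstractedness(article_tokens, summary_tokens):
    """Percentage of summary 1-grams that never appear in the article."""
    article_set = set(article_tokens)
    novel = [t for t in summary_tokens if t not in article_set]
    return 100 * len(novel) / len(summary_tokens)

def keep_article(article, summary, tokenize=str.split):
    """Apply the card's length and abstractedness filters (1-gram level)."""
    article_tokens = tokenize(article)
    summary_tokens = tokenize(summary)
    # Drop articles with < 150 article words or < 15 summary words.
    if len(article_tokens) < 150 or len(summary_tokens) < 15:
        return False
    # Drop articles whose summary is too abstractive (score > 60%).
    return abstractedness(article_tokens, summary_tokens) <= 60
```

For real Thai text, `tokenize` would be replaced by `pythainlp.word_tokenize(..., engine="newmm")`, since whitespace splitting does not segment Thai.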
After the data-cleansing process, the ThaiSum dataset contains over 358,000 articles. The size of this dataset is comparable to that of a well-known English document summarization dataset, the CNN/Daily Mail dataset. Moreover, we analyse the characteristics of this dataset by measuring the abstractedness level, compression rate, and content diversity. For more details, see [thaisum_exploration.ipynb](https://github.com/nakhunchumpolsathien/ThaiSum/blob/master/thaisum_exploration.ipynb).
#### Dataset Statistics
The ThaiSum dataset consists of 358,868 articles. The average lengths of article texts and summaries are approximately 530 and 37 words, respectively. As mentioned earlier, we also collected the headlines, tags, and labels provided in each article. Tags are similar to keywords of the article. An article normally contains several tags but only a few labels. Tags can be names of places or persons the article is about, while labels indicate the news category (politics, entertainment, etc.). Ultimately, ThaiSum contains 538,059 unique tags and 59 unique labels. Note that not every article contains tags or labels.
|Dataset Size| 358,868 | articles |
|:---|---:|---:|
|Avg. Article Length| 529.5 | words|
|Avg. Summary Length | 37.3 | words|
|Avg. Headline Length | 12.6 | words|
|Unique Vocabulary Size | 407,355 | words|
|Occurring > 10 times | 81,761 | words|
|Unique News Tag Size | 538,059 | tags|
|Unique News Label Size | 59 | labels|
#### Who are the source language producers?
Journalists of respective articles
### Annotations
#### Annotation process
`summary`, `type` and `tags` are created by journalists who wrote the articles and/or their publishers.
#### Who are the annotators?
`summary`, `type` and `tags` are created by journalists who wrote the articles and/or their publishers.
### Personal and Sensitive Information
All data are public news articles. No personal and sensitive information is expected to be included.
## Considerations for Using the Data
### Social Impact of Dataset
- News summarization in Thai
- Language modeling for Thai news
### Discussion of Biases
- [ThaiPBS](https://www.thaipbs.or.th/home) [receives funding from Thai government](https://www.bangkokbiznews.com/blog/detail/648740).
- [Thairath](https://www.thairath.co.th/) is known as [the most popular newspaper in Thailand](https://mgronline.com/onlinesection/detail/9620000058532); no clear political leaning.
- [The Standard](https://thestandard.co/) is a left-leaning online magazine.
- [Prachathai](https://prachatai.com/) is a left-leaning, human-right-focused news site.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[@nakhunchumpolsathien](https://github.com/nakhunchumpolsathien/)
[@caramelWaffle](https://github.com/caramelWaffle)
### Licensing Information
MIT License
### Citation Information
```
@mastersthesis{chumpolsathien_2020,
  title={Using Knowledge Distillation from Keyword Extraction to Improve the Informativeness of Neural Cross-lingual Summarization},
  author={Chumpolsathien, Nakhun},
  year={2020},
  school={Beijing Institute of Technology}
}
### Contributions
Thanks to [@cstorm125](https://github.com/cstorm125) for adding this dataset. |
cvzion/dqg-dataset-v2024-03-28 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 58405
num_examples: 95
download_size: 24515
dataset_size: 58405
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HamdanXI/cleaned_daily_dialog_sentence | ---
dataset_info:
features:
- name: dialogue
dtype: string
splits:
- name: train
num_bytes: 5434241
num_examples: 77350
download_size: 3467625
dataset_size: 5434241
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cleaned_daily_dialog_sentence"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/200955_Sentences_Mandarin_Prosodic_Corpus_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
Annotation of four prosodic hierarchies for 200,000 carefully selected Chinese texts covering news and colloquial sentences. Sentence lengths are appropriate and sentence patterns are diversified. The data can be used as a training set for TTS front-end prosody prediction.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1027?source=Huggingface
# Specifications
## Data content
prosodic annotation for 200,955 selected Chinese sentences
## Data scale
200,955 sentences
## Data source
All text comes from news and human conversations.
## Annotation
Four-level prosodic hierarchy annotation
## Language
Chinese
## Application scenarios
speech synthesis
## Accuracy
not lower than 99%
# Licensing Information
Commercial License
|
ibranze/araproje_hellaswag_tr_conf_gpt2_nearestscore_true | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87144
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_gpt2_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_plural_to_singular_human | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2824
num_examples: 13
- name: test
num_bytes: 6104
num_examples: 23
- name: train
num_bytes: 18210
num_examples: 84
download_size: 19906
dataset_size: 27138
---
# Dataset Card for "MULTI_VALUE_wnli_plural_to_singular_human"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jbrendsel/ECTSum | ---
license: unknown
task_categories:
- summarization
configs:
- config_name: default
data_files:
- split: train
path: "data.csv"
- split: test
path: "test.csv"
- split: valid
path: "val.csv"
--- |
docxster/invoices-v3.2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: words
sequence: string
- name: bboxes
sequence:
sequence: float64
- name: ner_tags
sequence: int64
- name: image_path
dtype: string
splits:
- name: train
num_bytes: 18396424
num_examples: 2443
- name: test
num_bytes: 7983416
num_examples: 1047
download_size: 16445312
dataset_size: 26379840
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
guardian_authorship | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
- topic-classification
pretty_name: GuardianAuthorship
dataset_info:
- config_name: cross_topic_1
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 677054
num_examples: 112
- name: test
num_bytes: 1283126
num_examples: 207
- name: validation
num_bytes: 374390
num_examples: 62
download_size: 3100749
dataset_size: 2334570
- config_name: cross_genre_1
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 406144
num_examples: 63
- name: test
num_bytes: 1657512
num_examples: 269
- name: validation
num_bytes: 677054
num_examples: 112
download_size: 3100749
dataset_size: 2740710
- config_name: cross_topic_2
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 677054
num_examples: 112
- name: test
num_bytes: 1104764
num_examples: 179
- name: validation
num_bytes: 552752
num_examples: 90
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_3
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 677054
num_examples: 112
- name: test
num_bytes: 927138
num_examples: 152
- name: validation
num_bytes: 730378
num_examples: 117
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_4
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 374390
num_examples: 62
- name: test
num_bytes: 1283126
num_examples: 207
- name: validation
num_bytes: 677054
num_examples: 112
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_5
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 374390
num_examples: 62
- name: test
num_bytes: 1407428
num_examples: 229
- name: validation
num_bytes: 552752
num_examples: 90
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_6
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 374390
num_examples: 62
- name: test
num_bytes: 1229802
num_examples: 202
- name: validation
num_bytes: 730378
num_examples: 117
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_7
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 552752
num_examples: 90
- name: test
num_bytes: 1104764
num_examples: 179
- name: validation
num_bytes: 677054
num_examples: 112
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_8
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 552752
num_examples: 90
- name: test
num_bytes: 1407428
num_examples: 229
- name: validation
num_bytes: 374390
num_examples: 62
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_9
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 552752
num_examples: 90
- name: test
num_bytes: 1051440
num_examples: 174
- name: validation
num_bytes: 730378
num_examples: 117
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_10
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 730378
num_examples: 117
- name: test
num_bytes: 927138
num_examples: 152
- name: validation
num_bytes: 677054
num_examples: 112
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_11
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 730378
num_examples: 117
- name: test
num_bytes: 1229802
num_examples: 202
- name: validation
num_bytes: 374390
num_examples: 62
download_size: 3100749
dataset_size: 2334570
- config_name: cross_topic_12
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 730378
num_examples: 117
- name: test
num_bytes: 1051440
num_examples: 174
- name: validation
num_bytes: 552752
num_examples: 90
download_size: 3100749
dataset_size: 2334570
- config_name: cross_genre_2
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 406144
num_examples: 63
- name: test
num_bytes: 1960176
num_examples: 319
- name: validation
num_bytes: 374390
num_examples: 62
download_size: 3100749
dataset_size: 2740710
- config_name: cross_genre_3
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 406144
num_examples: 63
- name: test
num_bytes: 1781814
num_examples: 291
- name: validation
num_bytes: 552752
num_examples: 90
download_size: 3100749
dataset_size: 2740710
- config_name: cross_genre_4
features:
- name: author
dtype:
class_label:
names:
'0': catherinebennett
'1': georgemonbiot
'2': hugoyoung
'3': jonathanfreedland
'4': martinkettle
'5': maryriddell
'6': nickcohen
'7': peterpreston
'8': pollytoynbee
'9': royhattersley
'10': simonhoggart
'11': willhutton
'12': zoewilliams
- name: topic
dtype:
class_label:
names:
'0': Politics
'1': Society
'2': UK
'3': World
'4': Books
- name: article
dtype: string
splits:
- name: train
num_bytes: 406144
num_examples: 63
- name: test
num_bytes: 1604188
num_examples: 264
- name: validation
num_bytes: 730378
num_examples: 117
download_size: 3100749
dataset_size: 2740710
---
# Dataset Card for "guardian_authorship"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://www.icsd.aegean.gr/lecturers/stamatatos/papers/JLP2013.pdf](http://www.icsd.aegean.gr/lecturers/stamatatos/papers/JLP2013.pdf)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 49.61 MB
- **Size of the generated dataset:** 38.98 MB
- **Total amount of disk used:** 88.59 MB
### Dataset Summary
A dataset for cross-topic authorship attribution, provided by Stamatatos 2013.
1- The cross-topic scenarios are based on Table-4 in Stamatatos 2017 (e.g., cross_topic_1 => row 1: P S U&W).
2- The cross-genre scenarios are based on Table-5 in the same paper (e.g., cross_genre_1 => row 1: B P S&U&W).
3- The same-topic/genre scenario is created by grouping all the datasets as follows.
For example, to use same_topic and split the data 60-40:
```python
train_ds = load_dataset('guardian_authorship', name="cross_topic_<<#>>",
                        split='train[:60%]+validation[:60%]+test[:60%]')
tests_ds = load_dataset('guardian_authorship', name="cross_topic_<<#>>",
                        split='train[-40%:]+validation[-40%:]+test[-40%:]')
```
IMPORTANT: `train+validation+test[:60%]` will generate the wrong splits because the data is imbalanced
* See https://huggingface.co/docs/datasets/splits.html for detailed/more examples
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### cross_genre_1
- **Size of downloaded dataset files:** 3.10 MB
- **Size of the generated dataset:** 2.74 MB
- **Total amount of disk used:** 5.84 MB
An example of 'train' looks as follows.
```
{
"article": "File 1a\n",
"author": 0,
"topic": 4
}
```
#### cross_genre_2
- **Size of downloaded dataset files:** 3.10 MB
- **Size of the generated dataset:** 2.74 MB
- **Total amount of disk used:** 5.84 MB
An example of 'validation' looks as follows.
```
{
"article": "File 1a\n",
"author": 0,
"topic": 1
}
```
#### cross_genre_3
- **Size of downloaded dataset files:** 3.10 MB
- **Size of the generated dataset:** 2.74 MB
- **Total amount of disk used:** 5.84 MB
An example of 'validation' looks as follows.
```
{
"article": "File 1a\n",
"author": 0,
"topic": 2
}
```
#### cross_genre_4
- **Size of downloaded dataset files:** 3.10 MB
- **Size of the generated dataset:** 2.74 MB
- **Total amount of disk used:** 5.84 MB
An example of 'validation' looks as follows.
```
{
"article": "File 1a\n",
"author": 0,
"topic": 3
}
```
#### cross_topic_1
- **Size of downloaded dataset files:** 3.10 MB
- **Size of the generated dataset:** 2.34 MB
- **Total amount of disk used:** 5.43 MB
An example of 'validation' looks as follows.
```
{
"article": "File 1a\n",
"author": 0,
"topic": 1
}
```
### Data Fields
The data fields are the same among all splits.
#### cross_genre_1
- `author`: a classification label, with possible values including `catherinebennett` (0), `georgemonbiot` (1), `hugoyoung` (2), `jonathanfreedland` (3), `martinkettle` (4).
- `topic`: a classification label, with possible values including `Politics` (0), `Society` (1), `UK` (2), `World` (3), `Books` (4).
- `article`: a `string` feature.
#### cross_genre_2
- `author`: a classification label, with possible values including `catherinebennett` (0), `georgemonbiot` (1), `hugoyoung` (2), `jonathanfreedland` (3), `martinkettle` (4).
- `topic`: a classification label, with possible values including `Politics` (0), `Society` (1), `UK` (2), `World` (3), `Books` (4).
- `article`: a `string` feature.
#### cross_genre_3
- `author`: a classification label, with possible values including `catherinebennett` (0), `georgemonbiot` (1), `hugoyoung` (2), `jonathanfreedland` (3), `martinkettle` (4).
- `topic`: a classification label, with possible values including `Politics` (0), `Society` (1), `UK` (2), `World` (3), `Books` (4).
- `article`: a `string` feature.
#### cross_genre_4
- `author`: a classification label, with possible values including `catherinebennett` (0), `georgemonbiot` (1), `hugoyoung` (2), `jonathanfreedland` (3), `martinkettle` (4).
- `topic`: a classification label, with possible values including `Politics` (0), `Society` (1), `UK` (2), `World` (3), `Books` (4).
- `article`: a `string` feature.
#### cross_topic_1
- `author`: a classification label, with possible values including `catherinebennett` (0), `georgemonbiot` (1), `hugoyoung` (2), `jonathanfreedland` (3), `martinkettle` (4).
- `topic`: a classification label, with possible values including `Politics` (0), `Society` (1), `UK` (2), `World` (3), `Books` (4).
- `article`: a `string` feature.
### Data Splits
| name |train|validation|test|
|-------------|----:|---------:|---:|
|cross_genre_1| 63| 112| 269|
|cross_genre_2| 63| 62| 319|
|cross_genre_3| 63| 90| 291|
|cross_genre_4| 63| 117| 264|
|cross_topic_1| 112| 62| 207|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{article,
author = {Stamatatos, Efstathios},
year = {2013},
month = {01},
pages = {421-439},
title = {On the robustness of authorship attribution based on character n-gram features},
volume = {21},
journal = {Journal of Law and Policy}
}
@inproceedings{stamatatos2017authorship,
title={Authorship attribution using text distortion},
author={Stamatatos, Efstathios},
booktitle={Proc. of the 15th Conf. of the European Chapter of the Association for Computational Linguistics},
  volume={1},
pages={1138--1149},
year={2017}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@eltoto1219](https://github.com/eltoto1219), [@malikaltakrori](https://github.com/malikaltakrori) for adding this dataset. |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-72000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1074995
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
KhalfounMehdi/dermatology_anomaly_detection_vit | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': benign
'1': malignant
splits:
- name: train
num_bytes: 51521841.0
num_examples: 656
download_size: 51530132
dataset_size: 51521841.0
---
# Dataset Card for "dermatology_anomaly_detection_vit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Elynora/exqa | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 775332
num_examples: 2836
download_size: 302698
dataset_size: 775332
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Francesco/pills-sxdht | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': pills
'1': Cipro 500
'2': Ibuphil 600 mg
'3': Ibuphil Cold 400-60
'4': Xyzall 5mg
'5': blue
'6': pink
'7': red
'8': white
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: pills-sxdht
tags:
- rf100
---
# Dataset Card for pills-sxdht
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/pills-sxdht
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
pills-sxdht
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
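As a minimal illustration of the COCO bounding-box convention linked above, the following sketch (the function name is ours, not part of the dataset) converts a `[x_min, y_min, width, height]` box into corner coordinates:

```python
def coco_to_corners(bbox):
    # COCO boxes are [x_min, y_min, width, height]; convert to
    # [x_min, y_min, x_max, y_max] corner format.
    x, y, w, h = bbox
    return [x, y, x + w, y + h]


# First box from the example instance above.
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))
```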
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/pills-sxdht
### Citation Information
```
@misc{ pills-sxdht,
title = { pills sxdht Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/pills-sxdht } },
url = { https://universe.roboflow.com/object-detection/pills-sxdht },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
iloncka/qa_program_modules_docs | ---
annotations_creators:
- no-annotation
language:
- ru
language_creators:
- machine-generated
- found
license:
- afl-3.0
multilinguality:
- monolingual
pretty_name: qapmdocs
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- program modules descriptions
task_categories:
- question-answering
task_ids:
- closed-domain-qa
---
|
Vinibarcley/Larivoice | ---
license: openrail
---
|
edbeeching/gia-dataset-parquet-debug-mujoco | ---
dataset_info:
config_name: mujoco-ant
features:
- name: continuous_observations
sequence:
sequence: float32
length: 27
- name: continuous_actions
sequence:
sequence: float32
length: 8
- name: rewards
sequence: float32
splits:
- name: test
num_bytes: 288024
num_examples: 2
- name: train
num_bytes: 288024
num_examples: 2
download_size: 858378
dataset_size: 576048
configs:
- config_name: mujoco-ant
data_files:
- split: test
path: mujoco-ant/test-*
- split: train
path: mujoco-ant/train-*
---
# Dataset Card for "gia-dataset-parquet-debug-mujoco"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mixedbread-ai/wikipedia-data-en-2023-11 | ---
dataset_info:
features:
- name: _id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 20346504450
num_examples: 41488110
download_size: 10094783514
dataset_size: 20346504450
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arieg/bw_spec_cls_4_10_s_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '821'
'1': '822'
'2': '825'
'3': '853'
splits:
- name: train
num_bytes: 43884906.0
num_examples: 800
- name: test
num_bytes: 1117368.0
num_examples: 20
download_size: 38172148
dataset_size: 45002274.0
---
# Dataset Card for "bw_spec_cls_4_10_s_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hippocrates/OphthoFillIN_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 15076037
num_examples: 18389
- name: valid
num_bytes: 1938116
num_examples: 2298
- name: test
num_bytes: 1938116
num_examples: 2298
download_size: 6134435
dataset_size: 18952269
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
mole-code/org.springframework.ai | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 814829
num_examples: 173
- name: test
num_bytes: 202532
num_examples: 44
download_size: 304159
dataset_size: 1017361
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ozayezerceli/NodeSelectionDataset | ---
license: apache-2.0
---
|
iamnguyen/ds_by_sys_prompt_14 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 55078335.4912574
num_examples: 32293
download_size: 27776587
dataset_size: 55078335.4912574
---
# Dataset Card for "ds_by_sys_prompt_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_78_1713135207 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 305793
num_examples: 789
download_size: 158761
dataset_size: 305793
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Taki135/OpenOrca_more_than_100_tokens | ---
language:
- en
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2927171424.10108
num_examples: 1716231
download_size: 2007021496
dataset_size: 2927171424.10108
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Shiveswarran/instruction_code_v9_man_dup_279 | ---
license: apache-2.0
---
|
camilaslz/helal | ---
license: openrail
---
|
Ammok/walmart_sales_prediction | ---
license: mit
---
|
AppleHarem/scavenger_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of scavenger (Arknights)
This is the dataset of scavenger (Arknights), containing 30 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
A WebUI that contains the crawlers and other tools: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 30 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 80 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 85 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 30 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 30 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 30 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 80 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 80 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 69 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 85 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 85 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
dominguesm/positive-reframing-ptbr-dataset | ---
dataset_info:
features:
- name: original_text
dtype: string
- name: reframed_text
dtype: string
- name: strategy
dtype: string
- name: strategy_original_text
dtype: string
splits:
- name: dev
num_bytes: 318805
num_examples: 835
- name: test
num_bytes: 321952
num_examples: 835
- name: train
num_bytes: 2586935
num_examples: 6679
download_size: 1845244
dataset_size: 3227692
---
# positive-reframing-ptbr-dataset
Brazilian Portuguese (pt-br) translation of the dataset from the paper ["Inducing Positive Perspectives with Text Reframing"](https://arxiv.org/abs/2204.02952). Used in the model [positive-reframing-ptbr](https://huggingface.co/dominguesm/positive-reframing-ptbr).
**Citation:**
> Ziems, C., Li, M., Zhang, A., & Yang, D. (2022). Inducing Positive Perspectives with Text Reframing. In _Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL)_.
**BibTeX:**
```tex
@inproceedings{ziems-etal-2022-positive-frames,
title = "Inducing Positive Perspectives with Text Reframing",
author = "Ziems, Caleb and
Li, Minzhi and
Zhang, Anthony and
Yang, Diyi",
booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics",
month = may,
year = "2022",
address = "Online and Dublin, Ireland",
publisher = "Association for Computational Linguistics"
}
``` |
gowd1/yarn1 | ---
license: bsl-1.0
---
|
nateraw/espeni-3 | ---
license:
- unknown
zenodo_id: '6606485'
converted_from: zenodo
---
# Dataset Card for Electrical half hourly raw and cleaned datasets for Great Britain from 2008-11-05
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://zenodo.org/record/6606485
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
<p><strong>A journal paper published in Energy Strategy Reviews details the method to create the data.</strong></p>
<p><strong>https://www.sciencedirect.com/science/article/pii/S2211467X21001280</strong></p>
<p> </p>
<p>2021-09-09: Version 6.0.0 was created. Now includes data for the North Sea Link (NSL) interconnector from Great Britain to Norway (https://www.northsealink.com). The previous version (5.0.4) should not be used - as there was an error with interconnector data having a static value over the summer 2021.</p>
<p> </p>
<p>2021-05-05: Version 5.0.0 was created. Datetimes now in ISO 8601 format (with capital letter 'T' between the date and time) rather than previously with a space (to RFC 3339 format) and with an offset to identify both UTC and localtime. MW values now all saved as integers rather than floats. Elexon data as always from www.elexonportal.co.uk/fuelhh, National Grid data from https://data.nationalgrideso.com/demand/historic-demand-data Raw data now added again for comparison of pre and post cleaning - to allow for training of additional cleaning methods. If using Microsoft Excel, the T between the date and time can be removed using the =SUBSTITUTE() command - and substitute "T" for a space " "</p>
<p>_____________________________________________________________________________________________________</p>
<p>2021-03-02: Version 4.0.0 was created. Due to a new interconnecter (IFA2 - https://en.wikipedia.org/wiki/IFA-2) being commissioned in Q1 2021, there is an additional column with data from National Grid - this is called 'POWER_NGEM_IFA2_FLOW_MW' in the espeni dataset. In addition, National Grid has dropped the column name 'FRENCH_FLOW' that used to provide the value for the column 'POWER_NGEM_FRENCH_FLOW_MW' in previous espeni versions. However, this has been changed to 'IFA_FLOW' in National Grid's original data, which is now called 'POWER_NGEM_IFA_FLOW_MW' in the espeni dataset. Lastly, the IO14 columns have all been dropped by National Grid - and potentially unlikely to appear again in future.</p>
<p>2020-12-02: Version 3.0.0 was created. There was a problem with earlier versions local time format - where the +01:00 value was not carried through into the data properly. Now addressed - therefore - local time now has the format e.g. 2020-03-31 20:00:00+01:00 when in British Summer Time.</p>
<p>2020-10-03: Version 2.0.0 was created as it looks like National Grid has had a significant change to the methodology underpinning the embedded wind calculations. The wind profile seems similar to previous values, but with an increasing value in comparison to the value published in earlier the greater the embedded value is. The 'new' values are from https://data.nationalgrideso.com/demand/daily-demand-update from 2013.</p>
<p>Previously: raw and cleaned datasets for Great Britain's publicly available electrical data from Elexon (www.elexonportal.co.uk) and National Grid (https://demandforecast.nationalgrid.com/efs_demand_forecast/faces/DataExplorer). Updated versions with more recent data will be uploaded with a differing version number and doi</p>
<p>All data is released in accordance with Elexon's disclaimer and reservation of rights.</p>
<p>https://www.elexon.co.uk/using-this-website/disclaimer-and-reservation-of-rights/</p>
<p>This disclaimer is also felt to cover the data from National Grid, and the parsed data from the Energy Informatics Group at the University of Birmingham.</p>
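As an alternative to the Excel `=SUBSTITUTE()` tip mentioned in the notes above, the ISO 8601 timestamps used since version 5.0.0 (capital 'T' between date and time, plus a UTC offset) can be parsed directly; a minimal Python sketch, using a British Summer Time value of the kind described in the release notes:

```python
from datetime import datetime

# Version 5.0.0+ timestamps are ISO 8601 with a 'T' separator and an offset.
ts = "2020-03-31T20:00:00+01:00"  # example local-time value (BST, UTC+1)
dt = datetime.fromisoformat(ts)
print(dt.year, dt.utcoffset())
```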
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The class labels in the dataset are in English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by Grant Wilson, Noah Godfrey
### Licensing Information
The license for this dataset is https://creativecommons.org/licenses/by-nc/4.0/legalcode
### Citation Information
```bibtex
@dataset{grant_wilson_2022_6606485,
author = {Grant Wilson and
Noah Godfrey},
title = {{Electrical half hourly raw and cleaned datasets
for Great Britain from 2008-11-05}},
month = jun,
year = 2022,
note = {{Grant funding as part of Research Councils (UK)
EP/L024756/1 - UK Energy Research Centre research
programme Phase 3 Grant funding as part of
Research Councils (UK) EP/V012053/1 - The Active
Building Centre Research Programme (ABC RP)}},
publisher = {Zenodo},
version = {6.0.9},
doi = {10.5281/zenodo.6606485},
url = {https://doi.org/10.5281/zenodo.6606485}
}
```
### Contributions
[More Information Needed] |
WahajRaza/Dermnet | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Acne and Rosacea Photos
'1': Actinic Keratosis Basal Cell Carcinoma and other Malignant Lesions
'2': Atopic Dermatitis Photos
'3': Bullous Disease Photos
'4': Cellulitis Impetigo and other Bacterial Infections
'5': Eczema Photos
'6': Exanthems and Drug Eruptions
'7': Hair Loss Photos Alopecia and other Hair Diseases
'8': Herpes HPV and other STDs Photos
'9': Light Diseases and Disorders of Pigmentation
'10': Lupus and other Connective Tissue diseases
'11': Melanoma Skin Cancer Nevi and Moles
'12': Nail Fungus and other Nail Disease
'13': Poison Ivy Photos and other Contact Dermatitis
'14': Psoriasis pictures Lichen Planus and related diseases
'15': Scabies Lyme Disease and other Infestations and Bites
'16': Seborrheic Keratoses and other Benign Tumors
'17': Systemic Disease
'18': Tinea Ringworm Candidiasis and other Fungal Infections
'19': Urticaria Hives
'20': Vascular Tumors
'21': Vasculitis Photos
'22': Warts Molluscum and other Viral Infections
splits:
- name: train
num_bytes: 1239882803.8663566
num_examples: 13223
- name: test
num_bytes: 219857588.92064318
num_examples: 2334
download_size: 1473388407
dataset_size: 1459740392.7869997
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
TeeA/text2sql_vi | ---
dataset_info:
features:
- name: schema_syll
dtype: string
- name: schema_word
dtype: string
- name: query_syll
dtype: string
- name: source
dtype: string
- name: question_syll
dtype: string
- name: question_word
dtype: string
- name: query_word
dtype: string
splits:
- name: train
num_bytes: 382949305
num_examples: 243964
download_size: 131810647
dataset_size: 382949305
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yzhuang/autotree_automl_bank-marketing_gosdt_l256_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2773600000
num_examples: 100000
- name: validation
num_bytes: 277360000
num_examples: 10000
download_size: 412140145
dataset_size: 3050960000
---
# Dataset Card for "autotree_automl_bank-marketing_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
junya1/shougi_kaisetu | ---
license: apache-2.0
---
|
Asap7772/persona_gpt4_paired_margin5 | ---
dataset_info:
features:
- name: x
dtype: string
- name: yw
dtype: string
- name: yl
dtype: string
- name: scorew
dtype: int64
- name: scorel
dtype: int64
- name: genw
dtype: string
- name: genl
dtype: string
- name: scorer
dtype: string
- name: scorer_id
dtype: int64
- name: scorerw_id
dtype: int64
- name: scorerl_id
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1668874420
num_examples: 519113
- name: test
num_bytes: 769074
num_examples: 238
download_size: 38697758
dataset_size: 1669643494
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Sao10K__Stheno-L2-13B | ---
pretty_name: Evaluation run of Sao10K/Stheno-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Stheno-L2-13B](https://huggingface.co/Sao10K/Stheno-L2-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T19:58:15.473819](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-L2-13B/blob/main/results_2023-09-17T19-58-15.473819.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2925755033557047,\n\
\ \"em_stderr\": 0.004659064029280355,\n \"f1\": 0.35764366610738435,\n\
\ \"f1_stderr\": 0.004568345368095279,\n \"acc\": 0.43558446671888545,\n\
\ \"acc_stderr\": 0.010545764058478083\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2925755033557047,\n \"em_stderr\": 0.004659064029280355,\n\
\ \"f1\": 0.35764366610738435,\n \"f1_stderr\": 0.004568345368095279\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \
\ \"acc_stderr\": 0.008944213403553058\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403105\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Stheno-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|arc:challenge|25_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T19_58_15.473819
path:
- '**/details_harness|drop|3_2023-09-17T19-58-15.473819.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T19-58-15.473819.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T19_58_15.473819
path:
- '**/details_harness|gsm8k|5_2023-09-17T19-58-15.473819.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T19-58-15.473819.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hellaswag|10_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T22:32:10.395838.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T22:32:10.395838.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T22:32:10.395838.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T19_58_15.473819
path:
- '**/details_harness|winogrande|5_2023-09-17T19-58-15.473819.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T19-58-15.473819.parquet'
- config_name: results
data_files:
- split: 2023_08_31T22_32_10.395838
path:
- results_2023-08-31T22:32:10.395838.parquet
- split: 2023_09_17T19_58_15.473819
path:
- results_2023-09-17T19-58-15.473819.parquet
- split: latest
path:
- results_2023-09-17T19-58-15.473819.parquet
---
# Dataset Card for Evaluation run of Sao10K/Stheno-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-L2-13B](https://huggingface.co/Sao10K/Stheno-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-L2-13B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T19:58:15.473819](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-L2-13B/blob/main/results_2023-09-17T19-58-15.473819.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2925755033557047,
"em_stderr": 0.004659064029280355,
"f1": 0.35764366610738435,
"f1_stderr": 0.004568345368095279,
"acc": 0.43558446671888545,
"acc_stderr": 0.010545764058478083
},
"harness|drop|3": {
"em": 0.2925755033557047,
"em_stderr": 0.004659064029280355,
"f1": 0.35764366610738435,
"f1_stderr": 0.004568345368095279
},
"harness|gsm8k|5": {
"acc": 0.1197877179681577,
"acc_stderr": 0.008944213403553058
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403105
}
}
```
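As a quick sanity check on the JSON above, the `acc` reported in the `all` block appears to be the unweighted mean of the two per-task accuracies (`gsm8k` and `winogrande`). A minimal sketch, using only the numbers shown above (the equal-weight averaging is an assumption inferred from these values, not documented behavior):

```python
# Per-task accuracies copied from the "Latest results" JSON above.
latest = {
    "harness|gsm8k|5": {"acc": 0.1197877179681577},
    "harness|winogrande|5": {"acc": 0.7513812154696132},
}

# Unweighted mean over the tasks that report `acc`.
accs = [task["acc"] for task in latest.values()]
mean_acc = sum(accs) / len(accs)

print(mean_acc)  # should match the reported "all" acc of 0.43558446671888545
```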
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-computer_security | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 4004
num_examples: 5
- name: test
num_bytes: 310872
num_examples: 100
download_size: 68214
dataset_size: 314876
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-computer_security"
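Per the metadata above, the `answer` field is stored as a `class_label` whose indices map to the listed letter choices ('0' → A through '3' → D). A minimal decoder sketch (the helper name is illustrative, not part of the dataset; `datasets.ClassLabel.int2str` offers the same mapping when loading with the `datasets` library):

```python
# Letter choices in the order given by the card's class_label `names`.
ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer(idx: int) -> str:
    """Map a stored class-label index back to its letter choice."""
    return ANSWER_NAMES[idx]

print(decode_answer(3))  # → "D"
```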
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |