Building The Khipu Field Guide (KFG) Database

Khipu Kamayoq by Felipe Guaman Poma de Ayala

A Counter Counts to a Count:

Between 1600 and 1615 A.D., a native Quechua-speaking noble named Felipe Guaman Poma de Ayala wrote (and drew) a book-length account of native Inka life. He knew Spanish law allowed him to express his grievances against Spanish iniquities in a letter to the court of King Philip III of Spain. This drawing, subsequently colorized, presents the scene of a khipu-kamayuq (a khipu reader/maker) reciting a khipu to the emperor Sapa Inka Tupac Yupanki (topaynqa yupomqui). Tupac Yupanki was the son of emperor Pachakutiq, the great Sapa Inka conqueror who built the larger four-state Inka empire.

Their names are Quechua. Yupanki might have some association with Quechua yupay-nki, meaning "your counter." Pachakutiq can be broken down as kutiy-qa: one whose return (kutiy) starts a new pacha, a new time/universe. Intriguingly, the spelling Guaman Poma uses, with m's instead of n's, implies a southern Quechua dialect, not a native Cusco dialect.

The writing at the upper right, "administrador suyo-yoq - apu pomachaisa?", translates as "An administrator of a suyu (one of the four states of the Inka empire, known as tawa-n-tin-suyu)". Unfortunately, my modern-day Quechua skills leave me unable to translate the last line, but my guess is that here apu pomachaisa refers to the sacred place (apu) Pomacocha. Located close to the great city of Vilcashuaman, Pomacocha was regarded as an apu, likely because Tupac Yupanki was born there.

To my eye, this khipu-kamayuq is likely one of the empire's four state-level khipu-kamayuqs - a very high official indeed. The title, clipped in the photograph, was probably COLLCA, meaning storehouse (Quechua qollqa), so this is likely a reading of goods stored at a state qollqa (indicated by all the domed buildings in the background and foreground) to the emperor at his birthplace.

1. Khipu Sources

Decipherment, Morality and the Genesis of the Harvard KDB

In 1989, using the new abilities of the computer, Oxford University Press issued its fully integrated second edition, incorporating all the changes and additions of the supplements in twenty rather more slender volumes. To help boost sales in the late seventies a two-volume set in a much-reduced typeface was issued, a powerful magnifying glass included in every slipcase. Then came a CD-ROM, and not long afterward the great work was further adapted for use on-line. A third edition, with a vast budget, was in the works.

There is some occasional carping that the work reflects an elitist, male, British, Victorian tone. Yet even in the admission that, like so many achievements of the era, it did reflect a set of attitudes not wholly harmonic with those prevalent at the end of the twentieth century, none seem to suggest that any other dictionary has ever come close, or will ever come close, to the achievement that it offers. It was the heroic creation of a legion of interested and enthusiastic men and women of wide general knowledge and interest; and it lives on today, just as lives the language of which it rightly claims to be a portrait.

Simon Winchester - The Professor and the Madman:
    A Tale of Murder, Insanity, and the Making of the Oxford English Dictionary

Like the Oxford dictionary mentioned above, the Khipu Field Guide database is not the work of one or two people, but of a group of enthusiastic khipu researchers. I'd like to call out the many people who measured and created the first digital records of these khipus. First, however, I need to preface that with a discussion about "data sources" and morality.

The field of epigraphy/decipherment has many stories of "esteemed" scholars behaving badly. An example from history involves the decipherment of the Maya script. In the early 20th century, scholars were aware of the existence of the Maya script, but no one had been able to decipher it. In the 1950s, a young Soviet scholar named Yuri Knorosov began to make progress in deciphering the script, using a combination of linguistic analysis and comparisons to known Maya inscriptions. Knorosov's work was not immediately recognized by the Western academic establishment, in part because he was a communist and his work was seen as potentially supporting Soviet claims to influence in Central America. Meanwhile, a British scholar named Eric Thompson had been working on the Maya script for years but had made little progress.

Thompson, who was widely respected in the academic community, was determined to maintain his position as the leading Maya scholar, and he began to systematically undermine Knorosov's work. Eventually, Knorosov's decipherment of the Maya script was widely accepted, and Thompson's prejudices were discredited. However, Thompson's actions had delayed the recognition of Knorosov's work and contributed to a climate of mistrust and fear in the academic community. The pursuit of decipherment seems to breed personal ambition and political considerations that influence the interpretation of historical evidence. For me, Thompson's lesson reemphasizes the importance of integrity and accuracy in the pursuit of archaeological knowledge.

Pride, politics, and citizenship are not the only biases. There is also gender. The 20th-century architect Michael Ventris, who cracked Linear B, built his own work on that of "the unsung heroine Alice Kober." Sadly, the khipu field has recent, similar parallels. In this respect, the external question has been raised: should my use of khipu measurements depend on the khipu measurer's morality? Like the Oxford dictionary, some of the Khipu Field Guide database comes from a professor who did not reflect a set of attitudes wholly harmonic with those prevalent at the end of the twentieth century. Nonetheless, the khipus exist, as do the legion of interested and enthusiastic women and men of wide general knowledge and interest. In this discussion of data sources and morality, I have been guided by the philosopher/ethicist Erich Hatala Matthes. In his book Drawing the Line: What to Do with the Work of Immoral Artists from Museums to the Movies, Matthes explores the intrinsic and extrinsic values of a creator's work and shows how ethically nuanced and careful we have to be, should we decide to:

a.) Choose to morally criticize an artefact
b.) Choose to consider the artefact maker’s morality, and
c.) Choose to prescribe how we should view or not view the artefact.

Each of these choices carries ethical weight and many Socratic contradictions, and Matthes does his philosopher's best to question every seemingly simple judgement about the willingness to morally criticize.

One school of thought holds that just as a Jackson Pollock drip painting has no evil intent, in situ, neither do khipu measurements; by this reasoning, canceling the khipu seems like an odd choice. Another school of thought says the artefact is only morally tainted if you know the maker's intent was morally bad. If you didn't know what an arrowhead was used for, and only later discovered it killed not a deer, but a baby, would you cancel your archaeological research? Another probe questions how morality changes over time. Would you cancel research into Greek pottery because it depicted homoerotic scenes? How about 100 years ago? In the future, will this work be criticized because I was a fossil-fuel-burning, climate-change-inducing American? Matthes raises many of these Socratic questions.

In the end, I have chosen the Serenity Prayer - God grant me the serenity to accept the things I cannot change; the courage to change the things I can; and the wisdom to know the difference. In this approach, the prayer’s writer independently arrived at the same conclusion as Matthes - accept there was complicity, but choose a path forward using solidarity, with transparency and the courage to change the things I can:

  • Attribution and recognition of the original authors/measurers of khipus, in line with scientific convention. This includes the work of Marcia and Robert Ascher, Carrie Brezine, and Kylie Quave.
  • Restoration of evanescent khipus (khipus attributed as “being present and accounted for”, when in fact the data was absent), such as the work of Kylie Quave.
  • Restoration, typesetting, etc. of Ascher’s, Quave’s, etc. fieldnotes, which are as important as the khipu measurements themselves.
  • Reassertion of the importance of the work of Marcia Ascher’s khipu mathematics.

As for the wisdom to know the difference? I plead human.

Finally, a cautionary note about decipherment. I mentioned the British architect and amateur linguist Michael Ventris and the Linear B script above. After years of study, Ventris became convinced that Linear B was an early form of Greek. This proved to be true, and it was a major breakthrough in our understanding of ancient Greek civilization and language. However, the decipherment of Linear B also had unforeseen consequences. The finding that the language was "Greek" radically overturned a standard picture of the prehistoric Aegean in the second millennium BC - and one standard view of when, where, and how Greek speakers arrived on the scene. The implications of Ventris's interpretation, in other words, went far beyond anything to do with how many cattle the Minoans had, or how many jars of olive oil they had collected. Instead, it called into question the separateness and identities of the Minoans of Crete and the Mycenaeans of the mainland. As with the decipherment of our DNA to determine our ancestry, it is not surprising that people might be reluctant to face, just on the basis of an ambitious piece of decipherment of some receipts written on clay tablets, that they are not who they thought they were. Decipherment can have profound and unexpected consequences, and it often challenges long-held assumptions and beliefs.

Data Sources for the Khipu Field Guide (KFG)

The acronym KFG refers to this Khipu Field Guide database. Some of the KFG database was derived from data in the Harvard Khipu Database, often referred to as the KDB: a collection of khipu measurements from various authors, cataloged and then converted into Excel files and a subsequent SQL database. Consequently, it is important to note this: The Khipu Field Guide (KFG) database is not a copy of the Harvard KDB. Although some data in the KFG is extracted from the KDB, and then vetted, cleansed, and normalized, the KFG is a larger collection of khipu measurements and reference information, created from multiple sources, including:

  • The Harvard Khipu Database (KDB) - The Harvard KDB builds about half of its khipus from external sources such as Marcia Ascher's databooks, Hugo Pereyra's publications, etc. The other half comes from the individual measurements, via Excel files and SQL tables, of Gary Urton and Carrie Brezine. In total, their catalog provides the source material for approximately 490 khipus. This data requires considerable amounts of both programmatic and by-hand data-cleansing to be drawable. Approximately 20% of the khipus in the KDB are so malformed or incomplete that they cannot be used for further study - a loss of over 135 khipus from the original 623 nontrivial khipus in the KDB. As will be seen, many of those lost were subsequently restored, either from the original author's source or from a combination of sources.
  • Journal articles and publications (Ascher) - Marcia Ascher published her khipus in two forms: her databooks and journal articles. The KFG includes AS001-009 from the Aschers' publications and databook notes. From their databooks, I transcribed and reformatted/typeset approximately 230 of Marcia and Robert Ascher's detailed notes on their khipus into math-ready equations and text. This added a substantial amount of contextual information and guides for further research. The Ascher khipus contained in the KFG have been assembled from her databooks, published articles, and the KDB. Much modification had to be done to the KDB entries, based on information in the databooks, to create a complete, well-formed set of khipus.
  • Researcher’s own Excel spreadsheets:
    • 22 previously unpublished khipus measured by Manuel Medrano.
    • 22 previously unpublished khipus measured by Kylie Quave. Although the KDB claims to have entered all 22 of Kylie Quave's khipus, only 3 are actually reasonably correct, and 16 are so malformed and truncated that they are useless; the remaining 3 are new, and appear only in the Khipu Field Guide. I am grateful that Kylie Quave's original Excel files could be obtained, so that her work can be recognized properly.
    • 19 spreadsheets compiled from various sources, by hand, by Ashok Khosla, to reassemble malformed Pereyra and Urton khipus.
  • Jon Clindaniel's Ph.D. thesis at Harvard - Excel files used in reassembling ~80 malformed khipus in the KDB. Dr. Clindaniel's files contain data not available in the publicly available Harvard KDB, in either SQL or Excel form. While the data doesn't show all of a khipu's information (primary cord information, missing clusters, notes, etc.), it does have enough to allow a reasonable khipu reconstruction when augmented by other data. All of Dr. Clindaniel's 23 khipus (JC001-JC023) had been malformed in the Harvard SQL KDB, and were thus unconstructable.
  • The Open Khipu Repository - The OKR, as it's known, is a database that descended from the Harvard KDB. It has a new decolonial modernization of khipu naming, in which khipus are named sequentially upon discovery. Where possible, this new OKR name is mentioned in the tables and on individual khipu pages. However, I am used to the information value that hierarchical/set naming schemes bring (i.e. Linnaean taxonomy, URLs, etc.), and the sequential OKR naming scheme has too many issues for me to adopt it as the primary identifier: it's inflexible (the sequential naming scheme denies insertion), it's not open (I can't add new khipus to it), it's not mnemonic, and it makes reading the last 50 years of khipu literature harder. Consequently, given the easier identification and comprehension of the traditional two-letter naming scheme, and the fifty years of khipu literature using it, I have kept the old naming scheme as the primary identifier.

Alas, there is one more complication with this statement.

Manuel Medrano identified (using the museum shoe-leather approach) that 15 of the khipus in the Harvard KDB are duplicates. At present, this list includes:

 #   Most Recent Measurement    Older Measurement   Merged Modern Name
 1   AS067/MA029                AS067/MA29          KH0080
 2   HP035 (Missing from KFG)   AS047               KH0058
 3   HP036                      AS038               KH0049
 4   HP041                      AS046               KH0057
 5   MM1086                     AS086               MM1086
 6   UR035                      AS070               KH0083
 7   UR043                      AS030               KH0032
 8   UR044                      AS031, UR1031       KH0033
 9   UR083                      AS208               KH0227
10   UR126                      UR115               KH0350
11   UR133                      UR036               KH0267
12   UR163                      AS056               KH0067
13   UR176                      LL01                KH0001
14   UR236                      AS181               KH0197
15   UR281                      AS068               KH0081

As you can see, many of these are Ascher khipus remeasured by Gary Urton or Hugo Pereyra. In a few cases they are measurements by Urton of khipus that had already been measured by Urton. Want some similarity match tests? Here you go!

Textual information (chiefly Ascher notes on Ascher khipus, plus measurement notes on Urton khipus) is merged into new khipus using the modern names suggested by the OKR, shown above. This new naming scheme does not diminish the original authors' contributions, except for Urton, whose count diminishes by 3.
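The merge can be sketched as a simple lookup that resolves either the older or the newer investigator-era name to the merged modern name. The dictionary below is a hypothetical excerpt covering two rows of the duplicate table; the real KFG mapping covers all 15 entries:

```python
# Hypothetical excerpt of the duplicate-name map: both the newer and the
# older investigator name resolve to one merged, OKR-style KH name.
merged_name = {
    "UR035": "KH0083", "AS070": "KH0083",
    "UR043": "KH0032", "AS030": "KH0032",
}

def canonical_name(name):
    """Resolve an investigator-era name to its merged modern name, if any."""
    return merged_name.get(name, name)  # unknown names pass through unchanged

print(canonical_name("AS070"))  # → KH0083
print(canonical_name("AS001"))  # → AS001 (not a duplicate; unchanged)
```

Keeping the lookup total (non-duplicates pass through untouched) means it can be applied safely to every khipu name in the database.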

Again, since no one source is sufficient, the Khipu Field Guide (KFG) has used multiple sources to assemble its own database. Some khipus have had to rely on several sources - the Harvard KDB, AND the Ascher databooks, AND the Excel spreadsheets - to be completely and correctly assembled. From these numerous sources I have scraped, reconciled, cross-referenced, and assembled 653 useful, non-trivial khipus.

Having taken over five years to curate, edit, and assemble,
the Khipu Field Guide Database is now the world’s largest well-formed khipu database.

I need to stress the phrase well-formed. To draw khipus, first, the khipu data has to hang together: cord cluster groups have to have the right cords, cords have to have the right knots, and so on. This requirement that the database have "referential" integrity, so that it can be properly drawn, tests the quality of the database data and immediately makes errors visually apparent (or not visible, if it's that kind of error!). Consequently, much of the work on the KFG database was in resolving khipus so that they had referential integrity, as well as accuracy at a given level such as knots, cords, etc. Secondly, the khipu has to have interesting data - a primary cord alone is not enough.
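As a concrete illustration, the simplest referential-integrity test is that every cord must point at a khipu that actually exists. A minimal sketch with invented rows (the key columns mirror the KDB's khipu_main and cord tables):

```python
import pandas as pd

# Toy stand-ins for the KDB tables; the rows are invented for illustration.
khipu_df = pd.DataFrame({"khipu_id": [1000001, 1000002]})
cord_df = pd.DataFrame({
    "cord_id":  [1, 2, 3],
    "khipu_id": [1000001, 1000002, 1000999],  # 1000999 points at no khipu
})

def orphan_cords(cord_df, khipu_df):
    """Return cords whose khipu_id has no matching row in khipu_main."""
    return cord_df[~cord_df.khipu_id.isin(khipu_df.khipu_id)]

print(list(orphan_cords(cord_df, khipu_df).cord_id))  # → [3]
```

The same `isin` filter pattern generalizes to knots-to-cords, cords-to-clusters, and clusters-to-khipus.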

2. Who are the Counters of the Counters?

This then, is the Current Count of Counters upon whose shoulders the Khipu Field Guide now stands:

There are, at present, 653 unique, well-formed khipus. The original authors of these khipus are:
  • Gary Urton - (236 khipus) These khipus were assembled from the KDB, and Jon Clindaniel’s Excel files. Most of these files required further modifications and corrections to allow them to be analyzed and drawn.
  • Marcia Ascher - (229 khipus) Approximately 80 khipus first studied by the Aschers, were relabeled as Urton khipus by Urton. This action disregards scientific convention. I mention the original author, as a partial remedy, in the drawings.
  • Carrie Brezine - (61 khipus) Although labeled as Urton khipus, according to the Harvard KDB, 61 of the khipus were measured by Carrie Brezine. Again, to remedy this disregard for convention, I have mentioned the original author in the drawings.
  • Hugo Pereyra - (56 khipus) Assembled from a combination of the Harvard KDB (itself assembled from Pereyra's publications) and Jon Clindaniel's Excel files, used to fix malformed khipus.
  • Jon Clindaniel - (23 khipus) Assembled from Jon Clindaniel's Excel files to obtain cord information, and from the Harvard KDB to obtain cluster information.
  • Manuel Medrano - (22 khipus) Assembled from Manuel Medrano’s previously unpublished Excel files.
  • Kylie Quave - (22 khipus) Assembled from Kylie Quave’s previously unpublished Excel files.
  • Sabine Hyland - (2 khipus) Assembled from Sabine Hyland’s previously unpublished Excel files.
  • Leland Locke - (1 khipu) Assembled from the Harvard KDB.
  • Carol Mackey - (1 khipu) Reconstructed from her 1970 doctoral thesis.

3. Yak-Shaving: Assembling The Khipu Database

3.1 The Harvard Khipu Database

Prior to Urton's banishment from Harvard, the Harvard KDB was easily obtained from the Harvard Khipu Database project. The khipu data came in two forms: a limited set of 349 khipus in Excel, and a bigger SQL database of about 625+ khipus. I chose the larger database, a territory largely untraveled. So the steps were to:

  • Gather the data. In this case I started with the SQL data, due to the larger data set size.
  • Cleanse/check the data. This is where it always gets hairy. I was unable to find any code that dealt with the SQL database directly - so, like any pioneer in virgin territory, I got the arrows in the back. The data fails in various ways, and it took me a while to discover the many, many issues. From khipus with no cords, to khipus with no knots, to khipus with cords that didn't exist or belonged to another khipu, to knots that don't exist, it's been a fascinating Zen journey into the perils of data integrity. When crossing the bridge from archaeology to computation, I've learned, it takes ENORMOUS amounts of data checking and time.

Along the journey, one night, tired and frustrated with data integrity checks, I saw a note in the cord database: nudo desanudado - unknotted knot. That's what this cleansing has been like.

In the end, the Harvard Khipu Database yielded about 490 well-formed khipus. By back-filling from the original Harvard Excel files and Jon Clindaniel's thesis files, I was able to reassemble approximately 80 more malformed khipus. An additional 19 khipus were assembled by hand from several sources and restored, by me, into "whole" khipus. Importing Kylie Quave's khipus, extracted from her original Excel spreadsheets, completely replaced 16 khipus that were, in fact, empty, restored 3 partial khipus, and added 3 new khipus.

3.2 Data Gathering

The Harvard Khipu Database Project stored khipu measurements in two forms: Excel, and a larger set as SQL tables. The SQL tables are almost... ready for datamining. However, to make this project portable (in a software sense, not a cloth one LOL), and to save the hassles of a SQL server, etc., I converted the SQL to CSV files. I note that Urton prefixed the database tables with collca, the Quechua word for warehouse. A Quechua spelling is qollqa. I say a spelling because Quechua orthography is dialectical - it could be collca, khollqa, kolka...

Since the khipu database/tables are so small (a total of 100Mb in SQL statements), I used an open-source MySQL server and TablePlus, a SQL GUI, to:

  1. Restore the Khipu Project MySQL Database by concatenating all the SQL files:
    cat collca*.sql > make_khipu_db.sql
    and running the resulting SQL file on the MariaDB MySQL server.

  2. Save all the tables and query results of the Khipu Project MySQL Database as CSV files (using TablePlus)

    Voila - we now have a bunch of pandas-DataFrame-ready CSV files.
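The export step that TablePlus performs by hand can also be scripted. The sketch below uses Python's built-in sqlite3 as a stand-in for the MySQL server, with one invented collca_* table; against the real database you would swap in a MySQL connector (e.g. mysql-connector-python) and the restored make_khipu_db tables:

```python
import sqlite3
import pandas as pd

# Stand-in for the restored khipu database: an in-memory SQLite DB with one
# invented collca_* table (names and rows are illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE collca_khipu_main (khipu_id INTEGER, investigator_num TEXT)")
conn.execute("INSERT INTO collca_khipu_main VALUES (1000166, 'AS010')")
conn.commit()

# Enumerate the tables, then write each one out as a pandas-ready CSV.
tables = [row[0] for row in
          conn.execute("SELECT name FROM sqlite_master WHERE type='table'")]
for table in tables:
    pd.read_sql_query(f"SELECT * FROM {table}", conn).to_csv(f"{table}.csv", index=False)

print(tables)  # → ['collca_khipu_main']
```

Scripting the dump makes the CSV snapshot reproducible, rather than depending on a one-time GUI export.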

The Harvard Khipu database schema (description image by the Khipu Database Project) is shown below. The Python classes reconstruct this schema.

3.3 Creating the Initial Khipu DB

As said previously, the SQL database tables and query results are stored as CSV tables instead of as SQL CREATE statements. Key tables include khipu_main, cord, and cord_cluster. Tables that end with _dc are code descriptors for symbolic codes in the data tables. For example, ascher_color_dc tells you that color code MB translates to Medium Brown...
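Decoding a _dc descriptor table amounts to a merge against the data table. A minimal sketch with toy frames (the MB code is from the text above; the W row and the column names are invented for illustration):

```python
import pandas as pd

# Toy version of a *_dc descriptor table: it maps symbolic codes used in the
# data tables to human-readable text.
ascher_color_dc = pd.DataFrame({
    "code":        ["MB", "W"],
    "description": ["Medium Brown", "White"],
})
cord_colors = pd.DataFrame({"cord_id": [1, 2], "color_cd": ["MB", "W"]})

# Decode by merging the descriptor table onto the data table.
decoded = cord_colors.merge(ascher_color_dc, left_on="color_cd", right_on="code")
print(list(decoded.description))  # → ['Medium Brown', 'White']
```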

3.4 Data Cleansing

We start by building a virginal object-oriented database (OODB) of khipus (essentially Python class objects). Building the initial Khipu OODB of about 620 khipus takes about 10 minutes.

3.4.1 Create fresh copies of CSV Database

# Load required libraries and initialize Jupyter notebook
# Khipu Imports
import khipu_utils as ku
import khipu_kamayuq as kamayuq  # A Khipu Maker is known (in Quechua) as a Khipu Kamayuq
import khipu_qollqa as kq

# Make a clean CSV directory to build the KFG Database from scratch.
# Copy cleaner (i.e. some minor data fixes like UR189 instead of Ur180) original CSV files to the working directory.
# Copy those files to 'clean' files which become the working CSVs
import shutil
CSV_dir = kq.qollqa_data_directory()
    #Which does this:
    #     os.system(f"cd {CSV_dir};cp collca_CSV/CSV_BEGIN/*.csv {CSV_dir}")
    #     shutil.copy(f"{CSV_dir}khipu_main.csv", f"{CSV_dir}khipu_main_clean.csv");
    #     shutil.copy(f"{CSV_dir}primary_cord.csv", f"{CSV_dir}primary_cord_clean.csv");
    #     shutil.copy(f"{CSV_dir}cord_cluster.csv", f"{CSV_dir}cord_cluster_clean.csv");
    #     shutil.copy(f"{CSV_dir}cord.csv", f"{CSV_dir}cord_clean.csv");
    #     shutil.copy(f"{CSV_dir}ascher_cord_color.csv", f"{CSV_dir}ascher_cord_color_clean.csv");
    #     shutil.copy(f"{CSV_dir}knot_cluster.csv", f"{CSV_dir}knot_cluster_clean.csv");
    #     shutil.copy(f"{CSV_dir}knot.csv", f"{CSV_dir}knot_clean.csv");

3.4.2 Create Foundation OODB

# Build a fresh version of the object oriented database (OODB) that starts with the "raw" database.
# Remove Duplicates along the way...
print("Building initial khipu OODB")
all_khipus = [aKhipu for aKhipu in kamayuq.fetch_all_khipus(clean_build=True).values()]
print(f"Done - built and fetched {len(all_khipus)} khipus")
Building initial khipu OODB
0: 1000166
25: 1000334
50: 1000364
75: 1000044
100: 1000143
125: 1000070
150: 1000421
175: 1000446
200: 1000581
225: 1000642
250: 1000023
275: 1000303
300: 1000266
325: 1000249
350: 1000340
375: 1000176
400: 1000057
425: 1000120
450: 1000291
475: 1000407
500: 1000412
Unable to create khipu id 1000484 - exception 1000484
525: 1000499
550: 1000524
575: 1000553
600: 1000605
625: 1000653
Done - built and fetched 595 khipus

3.5 Data Errors and Data Cleansing

Let's start by looking at the big picture - what khipus do we have to work with? What's the "quality" and "integrity" of the data? We've already had one khipu fail - Khipu ID 1000484, known as UR167 or B/3453A from the American Museum of Natural History. This failure happens because there is also a Khipu ID 1000474 that is known as UR167!

The database contains many, many errors (have I said that already? 😂). Some are structural, like mispointed cords, and some are transcription errors. I fix transcription errors in three places:

  • Changing the SQL/CSV data - i.e. replacing the SQL data directly in the beginning CSVs. For example, khipu_main.csv (or the equivalent SQL table) has two errors: an empty row without any information (khipu id 10000500), and a mislabeled investigator name Ur189. I deleted the empty row by hand and edited the name to UR189 using MS Excel, prior to starting the database loading. As another example, I restore primary cord information missing from the tables for khipu AS014 in the primary cord CSV file, using MS Excel.
  • Modifying the data as it’s being saved. For example, Ascher Cord Colors have hundreds of typos and non-regular codes that need cleansing and normalizing.
  • Replacing/fixing data in the code itself. The most common example: fixing "impossible" cord cluster information for AS014, AS024, AS094, AS187, and AS207B, whose clusters consist of things like 3 cords starting at 35 cm, spaced 66 cm apart, on a 3 cm primary cord. Other examples include fixing incorrectly labeled top cords, trimming cord lengths (over 25') for UR149, handling knots with missing cords, long knots with num_turns=0 but values > 0, etc.
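The "impossible cluster" test above can be automated: the position of a cluster's last cord (start plus spacing times one less than the cord count) must not run past the end of the primary cord. A sketch with invented rows and illustrative column names (not the actual KDB schema):

```python
import pandas as pd

# Invented toy rows: the AS014 row mimics the impossible case described in
# the text (3 cords at 35 cm, spaced 66 cm, on a 3 cm primary cord).
clusters = pd.DataFrame({
    "khipu":        ["AS014", "UR001"],
    "start_cm":     [35.0, 2.0],
    "spacing_cm":   [66.0, 1.5],
    "num_cords":    [3, 4],
    "pcord_length": [3.0, 50.0],
})

# Position of the last cord in each cluster, then flag overruns.
clusters["end_cm"] = clusters.start_cm + (clusters.num_cords - 1) * clusters.spacing_cm
impossible = clusters[clusters.end_cm > clusters.pcord_length]
print(list(impossible.khipu))  # → ['AS014']
```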

A quick glance at khipu_main, the top-level khipu dataframe:

import pandas as pd  # assumed imported at notebook top; repeated so this cell runs standalone

khipu_main_df = pd.read_csv(f"{CSV_dir}khipu_main.csv")
khipu_main_df = kq.clean_column_names(khipu_main_df)
khipu_id earliest_age latest_age provenance date_discovered discovered_by museum_descr museum_name nickname museum_num ... investigator_num complete created_by created_on changed_by changed_on duplicate_flag duplicate_id archive_num orig_inv_num
0 1000166 0000-00-00 0000-00-00 NaN 0000-00-00 NaN NaN "Niedersächsische Landesmuseum, Hanover, West ... NaN 6271 ... AS010 0.0 katie 5/24/12 13:33 NaN 0000-00-00 00:00:00 0.0 0.0 0.0 AS010
1 1000167 0000-00-00 0000-00-00 NaN 0000-00-00 NaN NaN "Niedersächsische Landesmuseum, Hanover, West ... NaN 10087 ... AS011 0.0 katie 5/24/12 13:33 NaN 0000-00-00 00:00:00 0.0 0.0 0.0 AS011
2 1000180 0000-00-00 0000-00-00 NaN 0000-00-00 NaN NaN "Niedersächsische Landesmuseum, Hanover, West ... NaN 10217 ... AS012 0.0 leah 5/24/12 13:33 leah 10/21/03 9:59 0.0 0.0 0.0 AS012
3 1000184 0000-00-00 0000-00-00 NaN 0000-00-00 NaN NaN "Niedersächsische Landesmuseum, Hanover, West ... NaN 10086 ... AS013 0.0 leah 5/24/12 13:33 leah 11/10/03 13:07 0.0 0.0 0.0 AS013
4 1000185 0000-00-00 0000-00-00 NaN 0000-00-00 NaN NaN British Museum NaN NaN ... AS014 0.0 leah 11/17/03 13:07 leah 11/17/03 13:09 0.0 0.0 0.0 AS014

5 rows × 22 columns

So we have 634 khipus to start with in the database. In our first pass at creating a database above, we were only able to create 595 khipus, with 28 culled as duplicates and 11 culled due to zero cords or data integrity issues.

As I have discovered, however, that's not where the culling stops. A liberal number of mispointered or incomplete records exist in the SQL database. Most of the issues have to do with cords pointing to the wrong place - for example, Pendant Cord 1 belonging to Khipu 1 having a subsidiary cord that is attached to Khipu 2... which in turn has a subsidiary cord attached to Khipu 1, which in turn...
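Mispointed subsidiaries of this kind can be flagged mechanically: a subsidiary cord's parent must live on the same khipu. A sketch with an invented three-cord table (column names are illustrative):

```python
import pandas as pd

# Invented toy cord table: each cord has a khipu_id and, for subsidiaries,
# a parent cord. Cord 3 (on khipu 200) wrongly hangs off cord 1 (khipu 100).
cords = pd.DataFrame({
    "cord_id":   [1, 2, 3],
    "khipu_id":  [100, 100, 200],
    "parent_id": [None, 1, 1],
})

khipu_of = dict(zip(cords.cord_id, cords.khipu_id))   # cord -> its khipu
subs = cords.dropna(subset=["parent_id"])             # subsidiaries only
mispointed = subs[subs.parent_id.map(khipu_of) != subs.khipu_id]
print(list(mispointed.cord_id))  # → [3]
```

Running such a check over the whole cord table surfaces the cross-khipu cycles described above before they can corrupt a drawing.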

Let’s clean up funky values like ‘NaN’ (Not a Number):

khipu_df = khipu_main_df
khipu_df.museum_descr = khipu_df.museum_descr.fillna(value='')
khipu_df.nickname = khipu_df.nickname.fillna(value='')
khipu_df.provenance = khipu_df.provenance.fillna(value='')
khipu_df.provenance = np.where(khipu_df.provenance == 'unknown','Unknown', khipu_df.provenance)
khipu_df.provenance = np.where(khipu_df.provenance == '','Unknown', khipu_df.provenance)
khipu_df.region = khipu_df.region.fillna(value='')
khipu_df.region = np.where(khipu_df.region == 'unknown','Unknown', khipu_df.region)
khipu_df.region = np.where(khipu_df.region == '','Unknown', khipu_df.region)
khipu_df.conditionofkhipu = khipu_df.conditionofkhipu.fillna(value='')
print(f"Size of khipu dataframe is {khipu_df.shape}")
# khipu_df
Size of khipu dataframe is (634, 22)

Apparently some khipus are in fragmentary condition. Let's remove those for the purpose of this study. Also, orig_inv_num, meaning the original author who described the khipu, generally matches investigator_num. Some Ascher descriptions are replaced by Urton descriptions, but on the whole most Ascher descriptions are honored and labeled as such. In the khipu drawings, I display and restore the original investigator name from the palimpsest labeling of Ascher and Brezine khipus by Urton.
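Flagging the relabeled khipus is then a one-line comparison of investigator_num against orig_inv_num. A sketch with invented rows:

```python
import pandas as pd

# Invented example rows: orig_inv_num is the original describer's label,
# investigator_num the label now carried in the KDB; a mismatch marks a
# relabeled khipu.
df = pd.DataFrame({
    "investigator_num": ["UR083", "AS010"],
    "orig_inv_num":     ["AS208", "AS010"],
})
relabeled = df[df.investigator_num != df.orig_inv_num]
print(list(relabeled.orig_inv_num))  # → ['AS208']
```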

fragmentary_khipus_df = khipu_df[khipu_df.conditionofkhipu == "Fragmentary"]
fragmentary_khipu_ids = list(fragmentary_khipus_df.khipu_id.values)
fragmentary_khipu_names = list(fragmentary_khipus_df.investigator_num.values)
print(f"fragmentary_khipu_names: {fragmentary_khipu_names}")
fragmentary_khipu_names: ['QU03', 'QU04', 'QU05', 'QU06', 'QU07', 'QU10', 'QU14', 'QU15', 'QU17', 'QU18', 'QU19']

3.6 Examining Primary Cord Data

We now have a clean khipu database with 635 khipus to investigate.
Most?! khipus have a primary cord. Let’s examine the primary cord database:

primary_cord_df = pd.read_csv(f"{CSV_dir}primary_cord.csv") 
# Once again, let's clean up the columns
primary_cord_df = kq.clean_column_names(primary_cord_df)
khipu_id pcord_id structure thickness notes attached_to pcord_length fiber termination beginning created_by created_date changed_by changed_date twist plainnotes
0 1000000 1000000 P 0.0 NaN 0.0 26.0 CN K K cbrezine 11/23/11 19:42 NaN 0000-00-00 00:00:00 S NaN
1 1000001 1000001 P 0.0 nudo de comienzo entre 0.0 - 0.5 cm 0.0 16.5 CN K T cbrezine 11/23/11 19:42 NaN 0000-00-00 00:00:00 S nudo de comienzo entre 0.0 - 0.5 cm
2 1000002 1000002 P 0.0 solamente existe cordon principal entre: 0.0-5... 0.0 10.5 CL NaN NaN cbrezine 11/23/11 19:42 NaN 0000-00-00 00:00:00 S solamente existe cordon principal entre: 0.0-5...
3 1000003 1000003 P 0.0 4.0 cm: nudo que une khipu 109B con up Top Cor... 0.0 98.0 CN K K cbrezine 11/23/11 19:42 cbrezine 5/29/03 9:40 S 4.0 cm: nudo que une khipu 109B con up Top Cor...
4 1000004 1000004 P 0.0 65.5 cm: una prolongacion del cordon principal... 0.0 65.5 CN K T cbrezine 11/23/11 19:42 cbrezine 3/3/04 12:05 S 65.5 cm: una prolongacion del cordon principal...
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
630 1000657 1000657 P 0.0 NaN NaN 25.0 CN K T gurton 2/26/18 12:29 gurton 2/26/18 12:29 NaN NaN
631 1000658 1000658 P 0.0 NaN NaN 44.0 CN B NB gurton 2/26/18 12:29 gurton 2/26/18 12:29 NaN NaN
632 1000659 1000659 P 0.0 NaN NaN 57.0 CN K T gurton 2/26/18 12:29 gurton 2/26/18 12:29 NaN NaN
633 1000660 1000660 P 0.0 NaN NaN 19.0 CN K K gurton 2/26/18 12:29 gurton 2/26/18 12:29 NaN NaN
634 1000661 1000661 P 0.0 NaN NaN 31.0 CN K T gurton 2/26/18 12:29 gurton 2/26/18 12:29 NaN NaN

635 rows × 16 columns

Two questions immediately arise. Are there any primary cords that are not attached to a khipu? (In which case we should remove them.) And the notes for primary cords should be reviewed as well.

Remove primary cords belonging to fragmentary khipus or to the null row…

print(f"Before: primary_cord_df.shape = {primary_cord_df.shape}")
errant_khipu_ids = list((set(primary_cord_df.khipu_id.values) - set(khipu_df.khipu_id.values)) - set(fragmentary_khipu_ids))
errant_khipu_names = khipu_main_df[khipu_main_df.khipu_id.isin(errant_khipu_ids)].investigator_num.values
print(f"Removing errant_khipu_ids {errant_khipu_ids}")
print(f"Removing errant_khipu_names {errant_khipu_names}")

khipu_ids = khipu_df.khipu_id.values
primary_cord_df = primary_cord_df[primary_cord_df.khipu_id.isin(khipu_ids)]
print(f"After: primary_cord_df.shape = {primary_cord_df.shape}")

primary_cord_khipu_ids = primary_cord_df.khipu_id.values
print(f"Before: khipu_df.shape = {khipu_df.shape}")
khipu_df = khipu_df[khipu_df.khipu_id.isin(primary_cord_khipu_ids)]
print(f"After: khipu_df.shape = {khipu_df.shape}")
Before: primary_cord_df.shape = (635, 16)
Removing errant_khipu_ids [1000498, 1000594]
Removing errant_khipu_names []
After: primary_cord_df.shape = (633, 16)
Before: khipu_df.shape = (634, 22)
After: khipu_df.shape = (633, 22)
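The reconciliation above is just mutual key filtering between two tables. A minimal sketch of the pattern with toy DataFrames (the ids here are hypothetical, not real khipu ids):

```python
import pandas as pd

# Toy stand-ins for khipu_df and primary_cord_df.
khipu_df = pd.DataFrame({"khipu_id": [1, 2, 3]})
primary_cord_df = pd.DataFrame({"khipu_id": [1, 2, 4], "pcord_id": [10, 20, 40]})

# Drop primary cords whose khipu is missing...
primary_cord_df = primary_cord_df[primary_cord_df.khipu_id.isin(khipu_df.khipu_id.values)]
# ...and then drop khipus that lack a primary cord, so the two tables agree.
khipu_df = khipu_df[khipu_df.khipu_id.isin(primary_cord_df.khipu_id.values)]

print(sorted(khipu_df.khipu_id.tolist()), sorted(primary_cord_df.khipu_id.tolist()))
```

After both passes, each table contains exactly the khipu_ids the other one has.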

3.7 Cords and Cord Clusters

A few khipus have no cords. The khipu_kamayuq fetch routine filters out zero-cord khipus, so this step is redundant, but I include it here for occasional reference.

all_khipus = [aKhipu for aKhipu in kamayuq.fetch_all_khipus(remove_zero_cord_khipus=False).values()]
zero_cord_khipu_ids = [aKhipu.khipu_id for aKhipu in all_khipus if aKhipu.num_attached_cords()==0]
zero_cord_khipu_name = [kq.khipu_name_from_id(anId) for anId in zero_cord_khipu_ids]

print(f"Removing zero_cord_khipu_name {zero_cord_khipu_name}")
print(f"Before: khipu_df.shape = {khipu_df.shape}, Zero cord ids: {len(zero_cord_khipu_ids)}")
khipu_df = khipu_df[~khipu_df.khipu_id.isin(zero_cord_khipu_ids)]
print(f"After: khipu_df.shape = {khipu_df.shape}")
Removing zero_cord_khipu_name []
Before: khipu_df.shape = (633, 22), Zero cord ids: 0
After: khipu_df.shape = (633, 22)

Do the same for cords, cord clusters, and ascher_cord_colors

valid_khipu_ids = list(set(khipu_df.khipu_id.values) & set(kq.cord_cluster_df.khipu_id.values))
print(f"Before: cord_cluster_df.shape = {kq.cord_cluster_df.shape}")
cord_cluster_df = kq.cord_cluster_df[kq.cord_cluster_df.khipu_id.isin(valid_khipu_ids)]
print(f"After: cord_cluster_df.shape = {cord_cluster_df.shape}")

cord_df = pd.read_csv(f"{CSV_dir}cord.csv") 
cord_df = kq.clean_column_names(cord_df)

print(f"Before: cord_df.shape = {cord_df.shape}")
cord_df = cord_df[cord_df.khipu_id.isin(valid_khipu_ids)]
print(f"After: cord_df.shape = {cord_df.shape}")

ascher_cord_color_df = pd.read_csv(f"{CSV_dir}ascher_cord_color.csv") 
ascher_cord_color_df = kq.clean_column_names(ascher_cord_color_df)

# Ascher cord colors also point to primary cords (see pcord_flag)
print(f"Before: ascher_cord_color_df.shape = {ascher_cord_color_df.shape}")
valid_cord_color_ids = list(set(cord_df.cord_id.values) | set(primary_cord_df.pcord_id.values))
ascher_cord_color_df = ascher_cord_color_df[ascher_cord_color_df.cord_id.isin(valid_cord_color_ids)]

print(f"After: ascher_cord_color_df.shape = {ascher_cord_color_df.shape}")

# Many cords (1 in 6!) have NaN as their attached_to. What's up with that?
Before: cord_cluster_df.shape = (15699, 18)
After: cord_cluster_df.shape = (15699, 18)
Before: cord_df.shape = (56870, 25)
After: cord_df.shape = (55805, 25)
Before: ascher_cord_color_df.shape = (58609, 27)
After: ascher_cord_color_df.shape = (57341, 27)
(9253, 25)

3.8 Cord Clusters with Incorrect Cord Pointers

Some cords have missing parent cords. By comparing the pendant_from fields of cords against the cord_ids of clusters, I discovered that 44 khipus have cord clusters pointing to cords that don't belong to that khipu. For example, look at UR181/1000491, which has a cord (cord_id=3052039) whose pendant_from of 1000592 actually points to UR254/1000592.
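That kind of cross-khipu pointer can be caught by joining each cord's pendant_from back onto the cord table and comparing khipu_ids. A minimal sketch with toy data (column names follow the CSVs above; the values are hypothetical):

```python
import pandas as pd

# Toy cord table: cord 102 hangs from cord 201, which belongs to a different khipu.
cord_df = pd.DataFrame({
    "cord_id":      [101, 102, 201],
    "khipu_id":     [1,   1,   2],
    "pendant_from": [100, 201, 200],   # 100 and 200 would be primary cords
})

# Join each cord to its parent cord (when the parent is itself a cord).
merged = cord_df.merge(cord_df[["cord_id", "khipu_id"]],
                       left_on="pendant_from", right_on="cord_id",
                       how="inner", suffixes=("", "_parent"))

# A khipu_id mismatch means the cord points into another khipu.
bad_pointers = merged[merged.khipu_id != merged.khipu_id_parent]
print(bad_pointers.cord_id.tolist())
```

Here only cord 102 is flagged: its parent cord 201 lives in khipu 2, not khipu 1.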

UR003 and UR149

Two of the khipus, UR003 and UR149, have Excel files. On viewing them, I find that UR003 has 371 cords with data in their fields and a further 146 cords with nothing, while the database says it has 758 directly attached pendants and 761 cord cluster pendants. The database gives UR149 between 256 and 265 cords, but its Excel spreadsheet says it has 272. Clearly something's wrong.

has_cord_parents_mask = cord_df.pendant_from.isin(cord_df.cord_id.values)
has_pcord_parents_mask = cord_df.pendant_from.isin(primary_cord_df.pcord_id.values)
has_parents_mask = (has_cord_parents_mask | has_pcord_parents_mask)
num_orphan_cords = sum(~has_parents_mask)
print(f"# of cords missing parents = {num_orphan_cords}")
print(f"Before: cord_df.shape = {cord_df.shape}")
cord_df = cord_df[has_parents_mask]
print(f"After: cord_df.shape = {cord_df.shape}")
# of cords missing parents = 288
Before: cord_df.shape = (55805, 25)
After: cord_df.shape = (55517, 25)

Some clusters have zero cords. What to do about these? For now we leave them in and code defensively…

khipus_with_zero_cord_clusters = []
for khipu in all_khipus:
    for cluster in khipu.cord_clusters():
        if cluster.num_cords() == 0:
            khipus_with_zero_cord_clusters.append(khipu.kfg_name())  # record the khipu's name
malformed_khipus = sorted(list(set(khipus_with_zero_cord_clusters)))
print(f"Khipus with zero cord clusters ({len(malformed_khipus)}): {malformed_khipus}")
Khipus with zero cord clusters (5): ['AS012', 'UR146', 'UR188', 'UR190', 'UR255']
# Drop cords whose cluster no longer exists in cord_cluster_df
clusters_from_cords = set(cord_df.cluster_id.values)
clusters_in_clusters = set(cord_cluster_df.cluster_id.values)
missing_cord_clusters = clusters_from_cords - clusters_in_clusters
cord_df = cord_df[cord_df.cluster_id.isin(clusters_in_clusters)]


3.9 Rebuild DB

And finally a new rebuild

%%capture cell_output.txt
all_khipus = [aKhipu for aKhipu in kamayuq.fetch_all_khipus(clean_build=True).values()]

4. “Fixing” Malformed Harvard SQL Khipus

Some cords have missing clusters. I found 82 khipus that had cords with malformed clusters. In most cases, they appear to be subsidiary cords, or cords of unknown attachment (perhaps unattached from the main khipu). I suspect the clusters failed to appear when the khipus were translated from Excel format to SQL.

After six weeks of work, I was able to reconstruct 77 of these 82 malformed khipus by splicing together two sources: a publicly available set of excerpted Excel files from Jon Clindaniel’s Ph.D. thesis, containing only knot and cord information, and the validly placed (but otherwise incorrect) pendants, clusters, and primary cord from the SQL database. That work is documented here.

For each recoverable khipu, the SQL khipu is stripped down to its pendants, and a new structure of knots and subsidiary cords is then grafted onto it from that khipu’s Excel spreadsheet. This is all done by the utility class ExcelKhipu, which updates all the CSV pandas files with the new information.

Ultimately, these files have been saved in KFG Excel format, and we will reconstruct these files later when we read in pure KFG Excel files.

print("Khipus that were recoverable using Jon Clindaniel's Thesis files.")
# Recoverable Khipus
recoverable_khipu_rep = ku.multiline(", ".join(kq.recoverable_khipus()))
print(f" {recoverable_khipu_rep}")
Khipus that were recoverable using Jon Clindaniel's Thesis files.
 HP009, HP033, HP034, HP036, HP037, HP038, HP039, HP040, HP041, HP042, HP043, 
 HP044, HP046 A, HP046 B, HP047, HP051 A, HP053, HP054, HP057, JC001, JC002, 
 JC003, JC004, JC005, JC006, JC007, JC008, JC009, JC010, JC011, JC012, JC013, 
 JC014, JC015, JC016, JC017, JC018, JC019, JC020, JC021, JC022, JC023, UR196, 
 UR206, UR209, UR251, UR252, UR254, UR257, UR258, UR259, UR260, UR261, UR262, 
 UR263, UR266, UR267A, UR267B, UR268, UR269, UR270, UR271, UR272, UR273A, UR273B, 
 UR274A, UR275, UR276, UR277, UR278, UR279, UR281, UR284, UR288, UR293
#%%capture cell_output.txt
# Run as an offline process due to namespace issues
# import excel_khipu
# excel_khipu.reconcile_recoverable_khipus()
#os.system(f"cd {kq.project_directory()}/code/classes;python")
# kamayuq.rebuild_khipu_OODB()
# kq.build_KFG_qollqa()

# (khipu_dict, all_khipus) = kamayuq.fetch_khipus()
# kq.backup_qollqa(f"CSV_BASECAMP_3_{len(all_khipus)}")

After that offline process, let’s continue…


khipu_df = kq.khipu_df
primary_cord_df = kq.primary_cord_df
cord_cluster_df = kq.cord_cluster_df
cord_df = kq.cord_df
ascher_cord_color_df = kq.ascher_cord_color_df
knot_cluster_df = kq.knot_cluster_df
knot_df = kq.knot_df

5. Delete Orphaned Data

We now have lots of orphaned data. Let’s remove it.
  1. First delete all known bad khipus
  2. Then delete orphaned primary cords.
  3. Then delete clusters with no cords.
  4. Then delete ascher colors with no cords.
  5. Then delete orphaned knot clusters.
  6. Then delete orphaned knots.
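The ordering above matters: each table is filtered against the already-cleaned table one level up, so deletions cascade from khipus down to knots. A minimal sketch of that cascade with toy tables (ids hypothetical):

```python
import pandas as pd

khipu_df = pd.DataFrame({"khipu_id": [1]})                        # khipu 2 already deleted
cord_df  = pd.DataFrame({"cord_id": [11, 21], "khipu_id": [1, 2]})
knot_df  = pd.DataFrame({"knot_id": [111, 211], "cord_id": [11, 21]})

# Cascade: cords are filtered by surviving khipus, then knots by surviving cords.
cord_df = cord_df[cord_df.khipu_id.isin(khipu_df.khipu_id.values)]
knot_df = knot_df[knot_df.cord_id.isin(cord_df.cord_id.values)]

print(cord_df.cord_id.tolist(), knot_df.knot_id.tolist())
```

Deleting khipu 2 takes its cord (21) and that cord's knot (211) with it; running the steps in the reverse order would leave the knot orphaned.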
# Make sure all khipus that appear in deleted_khipus_df are deleted from khipu_df
deleted_khipus_df = pd.read_csv(f"{CSV_dir}/deleted_khipus.csv")
# (assumes deleted_khipus.csv carries an investigator_num column)
khipu_df = khipu_df[~khipu_df.investigator_num.isin(deleted_khipus_df.investigator_num.values)]

# Delete orphaned primary cords
primary_cord_df = primary_cord_df[primary_cord_df.khipu_id.isin(khipu_df.khipu_id)]

#Remove cords and cord_clusters that have no khipus associated with them as a result of all this deletion
cord_cluster_df = cord_cluster_df[cord_cluster_df.khipu_id.isin(khipu_df.khipu_id.values)]
cord_df = cord_df[cord_df.khipu_id.isin(khipu_df.khipu_id.values)]

# Remove any ascher colors that have no cords associated with them
# Note that Ascher cord colors also point to primary cords (see pcord_flag)
print(f"Before: ascher_cord_color_df.shape = {ascher_cord_color_df.shape}")
valid_cord_color_ids = list(set(cord_df.cord_id.values) | set(primary_cord_df.pcord_id.values))
ascher_cord_color_df = ascher_cord_color_df[ascher_cord_color_df.cord_id.isin(valid_cord_color_ids)]

# Remove knot clusters and knots from previously eliminated khipu. 
print(f"Before: knot_cluster_df.shape = {knot_cluster_df.shape}")
knot_cluster_df = knot_cluster_df[knot_cluster_df.cord_id.isin(cord_df.cord_id.values)]
print(f"After: knot_cluster_df.shape = {knot_cluster_df.shape}")

# Remove orphaned knots
knot_df = knot_df[knot_df.cord_id.isin(cord_df.cord_id)]
Before: ascher_cord_color_df.shape = (57341, 28)
Before: knot_cluster_df.shape = (59506, 12)
After: knot_cluster_df.shape = (46723, 12)

6. Ascher Cord Colors

Many of the color descriptors in the khipus are malformed or inconsistent. For example, YB:W and W:YB denote the same (mottled) cord color but are listed as separate colors. So we normalize: the color components of each descriptor are always sorted by grey-scale value. Similarly, badly formed Ascher cord colors such as W**BS are recoded to W:BS.
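The core of the normalization is order-independence: split a descriptor on its separator, sort the components by a fixed key, and rejoin, so that YB:W and W:YB collapse to one canonical form. A minimal sketch (the grey-value table here is hypothetical; the real mapping lives in khipu_cord_color):

```python
# Hypothetical grey-scale values for illustration only.
GREY_VALUE = {"W": 0.9, "YB": 0.6, "AB": 0.5, "MB": 0.3}

def canonical_mottled(descriptor):
    """Sort a ':'-separated (mottled) color descriptor by grey value, lightest first."""
    codes = descriptor.split(":")
    return ":".join(sorted(codes, key=lambda c: (GREY_VALUE.get(c, 0.0), c), reverse=True))

print(canonical_mottled("YB:W"))   # -> W:YB
print(canonical_mottled("W:YB"))   # -> W:YB  (both orderings normalize identically)
```

The same sort is applied per pattern type in the real code, just with a different join character for barberpole ('-') and striped ('%') descriptors.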

import khipu_cord_color as kcc

# A sampling of bogus/transformed color descriptors
bogus_mixed_color_types = {'-':'PK', '-MB:AB':'MB:AB', 'BY':'YB', 'GGW':'W:GG', 'AB:AB-GG':'AB-GG', 'MB:MB-W':'MB-W', 'W-MB-MB':'W-MB', 'AB-AB-GG-GG-MB':'AB-GG-MB',
                           'W**SR':'W:SR', 'W**BS':'W:BS', 'W*BS':'W:BS', 'W**VB':'W:VB', 'GR***':'GR', 'KB**SY':'KB:SY', 'W-AB-AB-MB':'W-AB-MB', 'MB-MB-TG-TG':'MB-TG', 'RB(0-0)AB:MB(0-0)':'RB:AB:MB',
                           'W*BS*':'W:BS', 'W**SR':'W:SR', 'W-MB-MB':'W-MB', 'W**BS':'W:BS', 'BS*KB*':'BS:KB', 'ABW':'AB:W', 'MB-MB:W':'MB:W', 'AB-AB-MB':'AB-MB', 
                           'BS*DB*':'BS:DB', 'DB*VB*':'D:VB', 'DB*0G*':'DB:0G', 'DB*KB*':'DB:KB', 'DB*SR*':'DB:SR', 'AB-AB-KB':'AB-KB', 'BS*_SY*':'BS:SY', '*':'PK', 
                           'W-W-AB-KB':'W-AB-KB', 'AB-AB-KB-KB':'AB-KB', 'W-W-AB-MB':'W-AB-MB', 'W-W-KB-KB':'W-KB', 'MBABGG':'MB:AB:GG', 'AB-MB-MB':'AB-MB', 'W**VB*':'W:VB', 
                           'GR***':'GR', 'W-AB-AB-MB':'W-AB-MB', 'KB**SY':'KB:SY', 'W-DB-DB':'W-DB', '#NAME?':'PK', 'AB-AB-CB':'AB-CB', 'MB-MB-TG-TG':'MB-TG', 
                           'W-AB-AB-GG':'W-AB-GG', 'AB-AB-CB-CB':'AB-CB', 'LG-AB-AB':'LG-AB', 'AB-GG-GG-KB-KB':'AB-GG-KB', 'W-W-AB-AB':'W-AB', 'BS*SY*':'BS:SY',
                           'KB*SR*':'KB:SR', 'W*0G*':'W:0G', 'W*SR*':'W:SR', 'W*SY*':'W:SY', 'W*VB*':'W:VB', 'W*DB*':'W:DB', 'KB*W*':'KB:W', 'W*KB*':'W:KB', 'BS*VB*':'BS:VB',
                           'BS*0G*':'BS:0G', 'BS*SR*':'BS:SR', 'DB*SY*':'DB:SY', 'FB*W*':'FB:W', 'DB*BS*':'DB:BS', 'KB*BS*':'KB:BS', 'W*':'W', 'W**DB*':'W:DB', 'W*FB*':'W:FB',
                           'W*KB**':'W:KB', 'W*SR**':'W:SR', 'DB*GR*':'DB:GR', 'BS*GR*':'BS:GR', 'DB*BS**':'DB:BS', 'B***':'B', 'DB*_W*':'DB:W', 'DB*W*':'DB:W','DB*':'DB',
                           'W-W-KB':'W-KB', 'GLSRYBMG':'GL:SR:YB:MG', 'GLSRYBWMG':'GL:SR:YB:W:MG', 'W-AB-MB-MB': 'W-AB-MB', 'WGSRMG':'W:G:SR:MG',
                           'W-W-MB-HB-HB':'W-MB-HB', 'AB-AB-BG-KB':'AB-BG-KB', 'KBW':'KB:W', 'MBAB':'MB:AB', 'YB-YB-FB':'YB-FB', 
                           'W-W-MB-MB':'W-MB', 'W*_DB':'W:DB', 'W**DB':'W:DB', 'W:W:GG:KB':'W:GG:KB', 'ABKB':'AB:KB', 'MBCB':'MB:CB', 'RL-RL-FR':'RL-FR', 'W-AB-AB':'W-AB',
                           'AB-AB-HB-HB':'AB-AB-HB', 'GG-GG-MB-MB':'GG-MB', 'W-W-AB':'W-AB', 'W-MB-MB:W':'W-MB:W', 'AB-AB-MB-MB':'AB-MB', 'MB(0-0)RB(0-0)MB:RB(0-0)':'MB:RB:MB',
                           'FR(0-0)AB:MB(0-0)':'FR:AB:MB', 'FR:AB(0-0)AB-MB(0-0)':'FR:AB-MB', 'AB:LG(0-0)MB(0-0)':'AB:LG:MB', 'MB:AB(0-0)MB(0-0)':'MB:AB', 'BDW':'BD:W',
                           'AB:YB(0-0)AB(0-0)MB:AB(0-0)':'AB:YB:MB', 'BY(0-0)KB:BY(0-0)BY(0-0)':'BY:KB', 'RB:AB(0-0)AB(0-0)':'RB:AB', 'MB:W(0-9)W(9-41.5)':'MB:W',
                           'W:':'W', ':-W':'W', ':W-':'W', ':W-W':'W', 'W***':'W', 'W***':'W', }

# An illustration of the correction (the actual implementation is kcc.well_formed_color):
def well_formed_color(colorcode_descriptor):
    colorcode_descriptor = kcc.fix_color_code(colorcode_descriptor) # Transform bogus colors
    (pattern, color_codes, rgbcolors) = kcc.parse_color_code(colorcode_descriptor)
    # Make it so that MB:W and W:MB are the same (W:MB)
    if pattern == 'barberpole': 
        colorcode_descriptor = "-".join(sorted(color_codes, key=lambda x: (kcc.color_code_to_grey_value(x),x), reverse=True))
    elif pattern == 'mottled': 
        colorcode_descriptor = ":".join(sorted(color_codes, key=lambda x: (kcc.color_code_to_grey_value(x),x), reverse=True))
    elif pattern == 'striped': 
        colorcode_descriptor = "%".join(sorted(color_codes, key=lambda x: (kcc.color_code_to_grey_value(x),x), reverse=True))
    return colorcode_descriptor

# Update the Ascher Cord Color Dataframe with well-formed colors
well_formed_colors = [kcc.well_formed_color(x) for x in list(ascher_cord_color_df.full_color.values)]
ascher_cord_color_df['full_color'] = well_formed_colors

Once again, save the cleaned DataFrames and rebuild.

%%capture cell_output.txt

all_khipus = [aKhipu for aKhipu in kamayuq.fetch_all_khipus(clean_build=True).values()]

7. Importing Additional Khipus

7.1 Using the KFG Excel format

The Khipu Field Guide has its own khipu Excel file format - a six sheet Excel workbook that documents a khipu. Using this format, khipus from external sources can be translated into Excel and imported. These currently include:
  • 1 khipu reconstructed from Carol Mackey’s thesis (CM009)
  • 2 new khipus from Sabine Hyland (SH001, SH002)
  • 9 new khipus from Marcia and Robert Ascher’s journal article Numbers and Relations from Ancient Andean Quipus. Using the article’s source data, I imported their 9 khipus into the Khipu Field Guide (KFG); that work is documented here. The Ascher_Excel_Book class does all the heavy lifting and, after many transformations, produces a set of standardized KFG Excel khipu files.
  • 22 new khipus via spreadsheets from Manuel Medrano
  • 22 new khipus via spreadsheets from Kylie Quave to completely replace malformed khipus originally referenced from the Harvard KDB
  • 42 khipus, rebuilt by hand by Ashok Khosla, via spreadsheets, from multiple sources, that completely replace malformed Pereyra and Urton khipus.
  • 77 khipus, rebuilt computationally by Ashok Khosla, via spreadsheets, from multiple sources, that completely replace malformed KDB SQL khipus, using information from Jon Clindaniel’s Ph.D. thesis.

First let’s list the khipus to import:

working_directory = f"{kq.project_directory()}/data/XLS/ADD_XLSX"
khipu_names = sorted(ku.basename_glob(r'[A-Z].*\.xlsx', working_directory))
khipu_rep = ku.multiline(", ".join([ku.basename(name) for name in khipu_names]))
print(f"Processing {len(khipu_names)} khipus: {khipu_rep}")
Processing 170 khipus: AS001, AS002, AS003, AS004, AS005, AS006, AS007, AS008, AS009, AS072, CM009, 
 HP009, HP033, HP034, HP037, HP038, HP039, HP040, HP042, HP043, HP044, HP045, 
 HP046_A, HP046_B, HP047, HP048, HP051_A, HP053, HP054, HP055, HP057, JC001, 
 JC002, JC003, JC004, JC005, JC006, JC007, JC008, JC009, JC010, JC011, JC012, 
 JC013, JC014, JC015, JC016, JC017, JC018, JC019, JC020, JC021, JC022, JC023, 
 KH0001, KH0032, KH0033, KH0049, KH0057, KH0058, KH0067, KH0080, KH0081, KH0083, 
 KH0197, KH0227, KH0267, KH0350, MM001, MM002, MM003, MM004, MM005, MM006_AN001, 
 MM007_AN002, MM008, MM009, MM010, MM011, MM012, MM013, MM014, MM015, MM016, 
 MM017, MM018, MM019, MM020, MM021, MM1086, QU001, QU002, QU003, QU004, QU005, 
 QU006, QU007, QU008, QU009, QU010, QU011, QU012, QU013, QU014, QU015, QU016, 
 QU017, QU018, QU019, QU020, QU021, QU022, SH001, SH002, UR001, UR004, UR017, 
 UR039, UR044, UR050, UR052, UR054, UR055, UR088, UR089, UR110, UR112, UR144, 
 UR155, UR165, UR167, UR190, UR193, UR196, UR206, UR209, UR221, UR251, UR252, 
 UR253, UR254, UR255, UR257, UR258, UR259, UR260, UR261, UR262, UR263, UR266, 
 UR267A, UR267B, UR268, UR269, UR270, UR271, UR272, UR273A, UR273B, UR274A, 
 UR275, UR276, UR277, UR278, UR279, UR280, UR284, UR288, UR292A, UR293

And then import them:

%%capture cell_output.txt
import warnings

import kfg_excel_reader
khipu_builder = kfg_excel_reader.KFG_Excel_Reader(working_directory, khipu_names, base_id=6000000, publish=True, run_silent=True)

all_khipus = [aKhipu for aKhipu in kamayuq.fetch_all_khipus(clean_build=True).values()]
KFG_Excel_Reader: Importing AS002.xlsx
KFG_Excel_Reader: Importing AS003.xlsx
KFG_Excel_Reader: Importing AS004.xlsx
KFG_Excel_Reader: Importing AS005.xlsx
KFG_Excel_Reader: Importing AS006.xlsx
KFG_Excel_Reader: Importing AS007.xlsx
KFG_Excel_Reader: Importing AS008.xlsx
KFG_Excel_Reader: Importing AS009.xlsx
KFG_Excel_Reader: Importing AS072.xlsx
KFG_Excel_Reader: Importing CM009.xlsx
KFG_Excel_Reader: Importing HP009.xlsx
KFG_Excel_Reader: Importing HP033.xlsx
KFG_Excel_Reader: Importing HP034.xlsx
KFG_Excel_Reader: Importing HP037.xlsx
KFG_Excel_Reader: Importing HP038.xlsx
KFG_Excel_Reader: Importing HP039.xlsx
KFG_Excel_Reader: Importing HP040.xlsx
KFG_Excel_Reader: Importing HP042.xlsx
KFG_Excel_Reader: Importing HP043.xlsx
KFG_Excel_Reader: Importing HP044.xlsx
KFG_Excel_Reader: Importing HP045.xlsx
KFG_Excel_Reader: Importing HP046_A.xlsx
KFG_Excel_Reader: Importing HP046_B.xlsx
KFG_Excel_Reader: Importing HP047.xlsx
KFG_Excel_Reader: Importing HP048.xlsx
KFG_Excel_Reader: Importing HP051_A.xlsx
KFG_Excel_Reader: Importing HP053.xlsx
KFG_Excel_Reader: Importing HP054.xlsx
KFG_Excel_Reader: Importing HP055.xlsx
KFG_Excel_Reader: Importing HP057.xlsx
KFG_Excel_Reader: Importing JC001.xlsx
KFG_Excel_Reader: Importing JC002.xlsx
KFG_Excel_Reader: Importing JC003.xlsx
KFG_Excel_Reader: Importing JC004.xlsx
KFG_Excel_Reader: Importing JC005.xlsx
KFG_Excel_Reader: Importing JC006.xlsx
KFG_Excel_Reader: Importing JC007.xlsx
KFG_Excel_Reader: Importing JC008.xlsx
KFG_Excel_Reader: Importing JC009.xlsx
KFG_Excel_Reader: Importing JC010.xlsx
KFG_Excel_Reader: Importing JC011.xlsx
KFG_Excel_Reader: Importing JC012.xlsx
KFG_Excel_Reader: Importing JC013.xlsx
KFG_Excel_Reader: Importing JC014.xlsx
KFG_Excel_Reader: Importing JC015.xlsx
KFG_Excel_Reader: Importing JC016.xlsx
KFG_Excel_Reader: Importing JC017.xlsx
KFG_Excel_Reader: Importing JC018.xlsx
KFG_Excel_Reader: Importing JC019.xlsx
KFG_Excel_Reader: Importing JC020.xlsx
KFG_Excel_Reader: Importing JC021.xlsx
KFG_Excel_Reader: Importing JC022.xlsx
KFG_Excel_Reader: Importing JC023.xlsx
KFG_Excel_Reader: Importing KH0001.xlsx
KFG_Excel_Reader: Importing KH0032.xlsx
KFG_Excel_Reader: Importing KH0033.xlsx
KFG_Excel_Reader: Importing KH0049.xlsx
KFG_Excel_Reader: Importing KH0057.xlsx
KFG_Excel_Reader: Importing KH0058.xlsx
KFG_Excel_Reader: Importing KH0067.xlsx
KFG_Excel_Reader: Importing KH0080.xlsx
KFG_Excel_Reader: Importing KH0081.xlsx
KFG_Excel_Reader: Importing KH0083.xlsx
KFG_Excel_Reader: Importing KH0197.xlsx
KFG_Excel_Reader: Importing KH0227.xlsx
KFG_Excel_Reader: Importing KH0267.xlsx
KFG_Excel_Reader: Importing KH0350.xlsx
KFG_Excel_Reader: Importing MM001.xlsx
KFG_Excel_Reader: Importing MM002.xlsx
KFG_Excel_Reader: Importing MM003.xlsx
KFG_Excel_Reader: Importing MM004.xlsx
KFG_Excel_Reader: Importing MM005.xlsx
KFG_Excel_Reader: Importing MM006_AN001.xlsx
KFG_Excel_Reader: Importing MM007_AN002.xlsx
KFG_Excel_Reader: Importing MM008.xlsx
KFG_Excel_Reader: Importing MM009.xlsx
KFG_Excel_Reader: Importing MM010.xlsx
KFG_Excel_Reader: Importing MM011.xlsx
KFG_Excel_Reader: Importing MM012.xlsx
KFG_Excel_Reader: Importing MM013.xlsx
KFG_Excel_Reader: Importing MM014.xlsx
KFG_Excel_Reader: Importing MM015.xlsx
KFG_Excel_Reader: Importing MM016.xlsx
KFG_Excel_Reader: Importing MM017.xlsx
KFG_Excel_Reader: Importing MM018.xlsx
KFG_Excel_Reader: Importing MM019.xlsx
KFG_Excel_Reader: Importing MM020.xlsx
KFG_Excel_Reader: Importing MM021.xlsx
KFG_Excel_Reader: Importing MM1086.xlsx
KFG_Excel_Reader: Importing QU001.xlsx
KFG_Excel_Reader: Importing QU002.xlsx
KFG_Excel_Reader: Importing QU003.xlsx
KFG_Excel_Reader: Importing QU004.xlsx
KFG_Excel_Reader: Importing QU005.xlsx
KFG_Excel_Reader: Importing QU006.xlsx
KFG_Excel_Reader: Importing QU007.xlsx
KFG_Excel_Reader: Importing QU008.xlsx
KFG_Excel_Reader: Importing QU009.xlsx
KFG_Excel_Reader: Importing QU010.xlsx
KFG_Excel_Reader: Importing QU011.xlsx
KFG_Excel_Reader: Importing QU012.xlsx
KFG_Excel_Reader: Importing QU013.xlsx
KFG_Excel_Reader: Importing QU014.xlsx
KFG_Excel_Reader: Importing QU015.xlsx
KFG_Excel_Reader: Importing QU016.xlsx
KFG_Excel_Reader: Importing QU017.xlsx
KFG_Excel_Reader: Importing QU018.xlsx
KFG_Excel_Reader: Importing QU019.xlsx
KFG_Excel_Reader: Importing QU020.xlsx
KFG_Excel_Reader: Importing QU021.xlsx
KFG_Excel_Reader: Importing QU022.xlsx
KFG_Excel_Reader: Importing SH001.xlsx
KFG_Excel_Reader: Importing SH002.xlsx
KFG_Excel_Reader: Importing UR001.xlsx
KFG_Excel_Reader: Importing UR004.xlsx
KFG_Excel_Reader: Importing UR017.xlsx
KFG_Excel_Reader: Importing UR039.xlsx
KFG_Excel_Reader: Importing UR044.xlsx
KFG_Excel_Reader: Importing UR050.xlsx
KFG_Excel_Reader: Importing UR052.xlsx
KFG_Excel_Reader: Importing UR054.xlsx
KFG_Excel_Reader: Importing UR055.xlsx
KFG_Excel_Reader: Importing UR088.xlsx
KFG_Excel_Reader: Importing UR089.xlsx
KFG_Excel_Reader: Importing UR110.xlsx
KFG_Excel_Reader: Importing UR112.xlsx
KFG_Excel_Reader: Importing UR144.xlsx
KFG_Excel_Reader: Importing UR155.xlsx
KFG_Excel_Reader: Importing UR165.xlsx
KFG_Excel_Reader: Importing UR167.xlsx
KFG_Excel_Reader: Importing UR190.xlsx
KFG_Excel_Reader: Importing UR193.xlsx
KFG_Excel_Reader: Importing UR196.xlsx
KFG_Excel_Reader: Importing UR206.xlsx
KFG_Excel_Reader: Importing UR209.xlsx
KFG_Excel_Reader: Importing UR221.xlsx
KFG_Excel_Reader: Importing UR251.xlsx
KFG_Excel_Reader: Importing UR252.xlsx
KFG_Excel_Reader: Importing UR253.xlsx
KFG_Excel_Reader: Importing UR254.xlsx
KFG_Excel_Reader: Importing UR255.xlsx
KFG_Excel_Reader: Importing UR257.xlsx
KFG_Excel_Reader: Importing UR258.xlsx
KFG_Excel_Reader: Importing UR259.xlsx
KFG_Excel_Reader: Importing UR260.xlsx
KFG_Excel_Reader: Importing UR261.xlsx
KFG_Excel_Reader: Importing UR262.xlsx
KFG_Excel_Reader: Importing UR263.xlsx
KFG_Excel_Reader: Importing UR266.xlsx
KFG_Excel_Reader: Importing UR267A.xlsx
KFG_Excel_Reader: Importing UR267B.xlsx
KFG_Excel_Reader: Importing UR268.xlsx
KFG_Excel_Reader: Importing UR269.xlsx
KFG_Excel_Reader: Importing UR270.xlsx
KFG_Excel_Reader: Importing UR271.xlsx
KFG_Excel_Reader: Importing UR272.xlsx
KFG_Excel_Reader: Importing UR273A.xlsx
KFG_Excel_Reader: Importing UR273B.xlsx
KFG_Excel_Reader: Importing UR274A.xlsx
KFG_Excel_Reader: Importing UR275.xlsx
KFG_Excel_Reader: Importing UR276.xlsx
KFG_Excel_Reader: Importing UR277.xlsx
KFG_Excel_Reader: Importing UR278.xlsx
KFG_Excel_Reader: Importing UR279.xlsx
KFG_Excel_Reader: Importing UR280.xlsx
KFG_Excel_Reader: Importing UR284.xlsx
KFG_Excel_Reader: Importing UR288.xlsx
KFG_Excel_Reader: Importing UR292A.xlsx
KFG_Excel_Reader: Importing UR293.xlsx
So as to not blotto hand-edited previous files, copy Text files by hand if needed!
Starting fetch:
0: 1000166
25: 1000334
50: 1000364
75: 1000054
100: 1000145
125: 1000165
150: 1000424
175: 1000450
200: 1000011
225: 1000332
250: 1000260
275: 1000337
300: 1000170
325: 1000053
350: 1000119
375: 1000290
400: 1000386
425: 1000496
450: 1000505
475: 1000534
500: 1000560
525: 6000010
550: 6000035
575: 6000060
600: 6000085
625: 6000110
650: 6000135
675: 6000160
That took 356 seconds --- 5.9 minutes
Made 653 khipus
(653, 49)
Made: khipu_summary: (653, 49)
0: 1000166
25: 1000334
50: 1000364
75: 1000054
100: 1000145
125: 1000165
150: 1000424
175: 1000450
200: 1000011
225: 1000332
250: 1000260
275: 1000337
300: 1000170
325: 1000053
350: 1000119
375: 1000290
400: 1000386
425: 1000496
450: 1000505
475: 1000534
500: 1000560
525: 6000010
550: 6000035
575: 6000060
600: 6000085
625: 6000110
650: 6000135
675: 6000160

8. Updated Museum Numbers and Provenance

The OKR/Open Khipu Repository has updated Museum numbers and Provenance for several khipus. Let’s use their inventory of updates to update the KFG database:

museum_num_updates = [('KH0120', 'VA24370(A)'), ('KH0121', 'VA24370(B)'), ('KH0142', 'VA63042(A)'), ('KH0143', 'VA63042(B)'),
('KH0189', 'VA16145(A)'), ('KH0190', 'VA16145(B)'), ('KH0193', 'VA37859(A)'), ('KH0194', 'VA37859(B)'), ('KH0197', 'VA66832'),
('KH0264', 'TM 4/5446'), ('KH0265', 'TM 4/5446'), ('KH0273', '32.30.30/53(A)'), ('KH0348', '1924.18.0001'), ('KH0349', '1931.37.0001'),
('KH0437', 'VA42597(A)'), ('KH0438', 'VA42597(B)'), ('KH0441', 'VA47114c(A)'), ('KH0442', 'VA47114c(B)'), ('KH0443', 'VA47114c(C)'),
('KH0447', 'VA16141(A)'), ('KH0448', 'VA16141(B)'), ('KH0450', 'VA42508(A)'), ('KH0451', 'VA42508(B)'), ('KH0458', 'VA47114b'),
('KH0463', 'VA44677a(A)'), ('KH0464', 'VA44677a(B)'), ('KH0468', 'VA63038(A)'), ('KH0469', 'VA63038(B)'), ('KH0478', 'VA42607(A)'),
('KH0479', 'VA42607(B)'), ('KH0480', 'VA42607(C)'), ('KH0481', 'VA42607(D)'), ('KH0484', 'VA42578i28'), ('KH0535', 'MSP 1389/RN 43370'),
('KH0558', 'MSP 1422/RN 43403'), ('KH0567', 'MNAAHP 4202'), ('KH0587', 'MNAAHP 30564'), ('KH0588', 'B397/T41299.22'), ('KH0589', 'B376/T41299.23'),
('KH0590', 'B388/T41299.24'), ('KH0591', 'B378/T41299.25'), ('KH0592', 'B377/T41299.26'), ('KH0593', 'B384/T41299.27'), ('KH0594', 'B372/T41299.28'),
('KH0595', 'B367/T41299.29'), ('KH0596', 'B366/T41299.30'), ('KH0597', 'B374/T41299.31'), ('KH0598', 'B375/T41299.32'),
('KH0599', 'B391/T41299.20'), ('KH0600', 'B369/T41299.33.A-B'), ('KH0601', 'B399/T41299.34'), ('KH0602', 'B373/T41299.18'),
('KH0603', 'B383&B383A/T41299.35.A-B'), ('KH0604', 'B395/T41299.36'), ('KH0605', 'B382/T41299.37'), ('KH0606', 'B371/T41299.38'),
('KH0405', '41.0/1550, B/3453A')]

provenance_updates = [('KH0085', 'Rancho San Juan, Ica Valley'), ('KH0086', 'Rancho San Juan, Ica Valley')]

khipu_df = kq.khipu_df
for (okr_name, new_museum_name) in museum_num_updates:
    kdb_name = kq.okr_name_to_kfg_name(okr_name)
    khipu_df.loc[khipu_df.investigator_num==kdb_name,'museum_num'] = new_museum_name
for (okr_name, new_provenance) in provenance_updates:
    kdb_name = kq.okr_name_to_kfg_name(okr_name)
    khipu_df.loc[khipu_df.investigator_num==kdb_name,'provenance'] = new_provenance

And a final complete rebuild.

%%capture cell_output.txt 
# Final Complete rebuild.
# Refresh in-memory databases

# Update khipu similar neighbors list

# Update khipu summary statistics, etc

all_khipus = [aKhipu for aKhipu in kamayuq.fetch_all_khipus(clean_build=False, run_silent=True).values()]

# Save the final KFG database in excel format
import kfg_excel_writer
[NbConvertApp] Converting notebook /Users/ashokkhosla/Desktop/Khipu/fieldguide/khipufieldguide/notebook/Khipu_EDA.ipynb to notebook
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
[NbConvertApp] Writing 5418155 bytes to /Users/ashokkhosla/Desktop/Khipu/fieldguide/khipufieldguide/notebook/Khipu_EDA.ipynb
[NbConvertApp] Converting notebook /Users/ashokkhosla/Desktop/Khipu/fieldguide/khipufieldguide/notebook/Khipu_EDA.ipynb to html
[NbConvertApp] Writing 5912501 bytes to /Users/ashokkhosla/Desktop/Khipu/fieldguide/khipufieldguide/notebook/Khipu_EDA.html
print(f"Final build - Made {len(all_khipus)} khipus")
Final build - Made 653 khipus

9. Sanity Check

A quick sanity check. The following test should find exactly 14 zero-knot khipus; if not, a warning is printed.

def is_zero_knot_khipu(aKhipu): return all([aCord.knotted_value()==0 for aCord in aKhipu[:,:]])
zero_knot_khipus = sorted([aKhipu.kfg_name() for aKhipu in all_khipus if is_zero_knot_khipu(aKhipu)])

verified_zero_knot_khipus = sorted(['AS025', 'HP025', 'HP026', 'HP028', 'HP048', 'QU001', 'UR070', 
                                    'UR071', 'UR082', 'UR103', 'UR158', 'UR179', 'UR185', 'UR216'])
if zero_knot_khipus != verified_zero_knot_khipus:
    print(f"WARNING: Found {len(zero_knot_khipus)} zero knot khipus:")
    print("     '" + "'\n     '".join(zero_knot_khipus) + "'")
else:
    print("TESTS PASSED -----------------------------------------")
TESTS PASSED -----------------------------------------

After considerable cross-referencing of sources, programming, and hand-work, all of the lost khipus referenced in the KDB collection have been reconstructed, and an additional 50 khipus have been added! Time for a celebration! :-)

🎊 🎉 🎇 🥳 🎊 🎉 🎇 🥳 🎊 🎉 🎇 🥳

10. KFG Database Completeness

Now that we have a database of khipus, it’s a good time to review how complete the data is. For example, how many khipus have cords with knots, with known colors, etc.? The data to inventory includes:

  • Number of Cords (including pendants, top cords, subsidiaries)
  • Number of Pendant Cords (not including top cords)
  • Number of Top Cords
  • Number of Subsidiaries
  • Cords With Known Colors
  • Cords with Known Cord Ply/Spin
  • Cords with Known Cord Attachments
  • Number of Knots
  • Khipus with at Least 1 Knot (versus so-called Zero-Knot Khipus)
  • Knots with Known Twists
  • Long Knots with Known Axis Orientation (for Long Knots)

Let’s evaluate each of these in turn. In the interest of making the search easier, we’ll look only at pendants for now. Similarly, we’ll only list khipus that have at least some data.

10.1 Number of Cords (including pendants, top cords, subsidiaries)

(khipu_dict, all_khipus) = kamayuq.fetch_khipus()

num_corded_khipus = sum([aKhipu.num_cc_cords() > 0 for aKhipu in all_khipus])
corded_khipus = [aKhipu for aKhipu in all_khipus if aKhipu.num_cc_cords() > 0]
num_cords = sum([aKhipu.num_cc_cords() for aKhipu in all_khipus])
num_pendants = sum([aKhipu.num_pendant_cords() for aKhipu in all_khipus])

print(f"Number of khipus with cords = {num_corded_khipus}")
print(f"Number of all cords (including top cords and subsidiaries) = {num_cords}")
print(f"Number of pendant cords (including top cords) = {num_pendants}")
Number of khipus with cords = 653
Number of all cords (including top cords and subsidiaries) = 57831
Number of pendant cords (including top cords) = 41745

10.2 Number of Pendant Cords (not including top cords)

num_top_cords = sum([aKhipu.num_top_cords() for aKhipu in all_khipus])
num_down_pendants = num_pendants - num_top_cords
print(f"Number of pendant cords which are not top cords = {num_down_pendants}")
Number of pendant cords which are not top cords = 41370

10.3 Number of Top Cords

num_top_cords = sum([aKhipu.num_top_cords() for aKhipu in all_khipus])
print(f"Number of top cords (total) = {num_top_cords}")
Number of top cords (total) = 375

10.4 Number of Subsidiaries

num_subsidiaries = sum([aKhipu.num_subsidiary_cords() for aKhipu in all_khipus])
print(f"Number of subsidiary cords (total) = {num_subsidiaries}")
Number of subsidiary cords (total) = 16095

10.5 Khipus with No Colors

How many khipus have no recorded cord colors?

no_color_khipus = []
def satisfaction_condition(aCord):
    return aCord.longest_ascher_color() == "PK" or aCord.longest_ascher_color() == ""

for aKhipu in all_khipus:
    if all([satisfaction_condition(aCord) for aCord in aKhipu[:,:]]):
        no_color_khipus.append(aKhipu.name())
print(f"# of Khipus with No Known Color is  {ku.pct_kfg_khipus(len(no_color_khipus))}")
khipu_rep = ku.multiline(no_color_khipus, continuation_char="\n ")
print(f"No Color Khipus =\n{khipu_rep}")
# of Khipus with No Known Color is  3 (0%)
No Color Khipus =
['AS073', 'AS187', 'AS072']
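The pattern above — keep the khipus in which every cord satisfies some condition — recurs in the zero-knot audit of section 10.9 as well. One way to factor it out as a reusable helper (a sketch on toy data; the accessors here are stand-ins, not the real KFG API):

```python
def khipus_where_all_cords(khipus, cord_predicate, get_cords, get_name):
    """Return names of khipus in which every cord satisfies cord_predicate."""
    return [get_name(k) for k in khipus
            if all(cord_predicate(c) for c in get_cords(k))]

# Toy data: each khipu is (name, list of Ascher color codes); "" means unrecorded
toy = [("AS073", ["", "PK"]), ("UR001", ["MB", "W"]), ("AS187", [""])]

no_color = khipus_where_all_cords(
    toy,
    cord_predicate=lambda color: color in ("", "PK"),
    get_cords=lambda k: k[1],
    get_name=lambda k: k[0])
print(no_color)  # ['AS073', 'AS187']
```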

10.6 Khipus with No Known Cord Ply/Spin

khipus_by_cord_ply = {}
for aKhipu in all_khipus:
    if num_khipu_cords := aKhipu.num_cc_cords():
        khipus_by_cord_ply[aKhipu.name()] = (aKhipu.num_s_cords() + aKhipu.num_z_cords())/num_khipu_cords
    else:
        khipus_by_cord_ply[aKhipu.name()] = 0
khipus_by_cord_ply = dict(sorted(khipus_by_cord_ply.items(), key=lambda x:x[1]))
zero_cord_ply_khipus = [key for key in khipus_by_cord_ply.keys() if khipus_by_cord_ply[key]==0 ]
num_zero_cord_ply_khipus = len(zero_cord_ply_khipus)
print(f"# of Khipus with No Known Cord Ply/Spin is {ku.pct_kfg_khipus(num_zero_cord_ply_khipus)}")

khipu_rep = ku.multiline(zero_cord_ply_khipus, line_length=80, continuation_char="\n ")
print(f"Zero Cord Ply Khipus =\n{khipu_rep}")
# of Khipus with No Known Cord Ply/Spin is 137 (21%)
Zero Cord Ply Khipus =
['AS010', 'AS011', 'AS012', 'AS013', 'AS014', 'AS015', 'AS016', 'AS017',
 'AS018', 'AS019', 'AS020', 'AS021', 'AS023', 'AS024', 'AS025', 'AS026A',
 'AS026B', 'AS027', 'AS028', 'AS029', 'AS035C', 'AS035D', 'AS036', 'AS037',
 'AS039', 'AS041', 'AS042', 'AS043', 'AS044', 'AS045', 'AS048', 'AS050', 'AS054',
 'AS055', 'AS059', 'AS060', 'AS061/MA036', 'AS062', 'AS063', 'AS063B', 'AS064',
 'AS065', 'AS065B', 'AS066', 'AS069', 'AS071', 'AS073', 'AS077', 'AS081',
 'AS082', 'AS083', 'AS085', 'AS089', 'AS090/N2', 'AS092', 'AS093', 'AS094',
 'AS101 - Part 1', 'AS101 - Part 2', 'AS110', 'AS111', 'AS112', 'AS115', 'AS122',
 'AS125', 'AS128', 'AS129', 'AS132', 'AS133', 'AS134', 'AS137', 'AS139', 'AS142',
 'AS153', 'AS155', 'AS156', 'AS157', 'AS158', 'AS159', 'AS160', 'AS164', 'AS168',
 'AS169', 'AS171', 'AS172', 'AS173', 'AS174', 'AS177', 'AS178', 'AS182',
 'AS182B', 'AS183', 'AS184', 'AS185', 'AS186', 'AS187', 'AS188', 'AS189',
 'AS201', 'AS202', 'AS203', 'AS204', 'AS205', 'AS206', 'AS207A', 'AS207B',
 'AS207C', 'AS209', 'AS210', 'AS211', 'AS212', 'AS213', 'AS214', 'AS215',
 'AS215F', 'AS35A', 'AS35B', 'HP052', 'UR1033A', 'UR1034', 'UR1040', 'UR1052',
 'UR1127', 'UR1141', 'AS001', 'AS002', 'AS003', 'AS004', 'AS005', 'AS006',
 'AS007', 'AS008', 'AS009', 'AS072', 'CM009', 'KH0058', 'KH0080']

10.7 Khipus with No Known Cord Attachment

khipus_by_cord_attachment = {}
for aKhipu in all_khipus:
    if num_khipu_cords := aKhipu.num_pendant_cords():
        khipus_by_cord_attachment[aKhipu.name()] = (aKhipu.num_top_cords() + aKhipu.num_recto_cords() + aKhipu.num_verso_cords())/num_khipu_cords
    else:
        khipus_by_cord_attachment[aKhipu.name()] = 0
khipus_by_cord_attachment = dict(sorted(khipus_by_cord_attachment.items(), key=lambda x:x[1]))
zero_cord_attachment_khipus = [key for key in khipus_by_cord_attachment.keys() if khipus_by_cord_attachment[key]==0 ]
num_zero_cord_attachment_khipus = len(zero_cord_attachment_khipus)
print(f"# of Khipus with No Known Cord Attachment is {ku.pct_kfg_khipus(num_zero_cord_attachment_khipus)}")

khipu_rep = ku.multiline(zero_cord_attachment_khipus, line_length=80, continuation_char="\n ")
print(f"Zero Cord Attachment Khipus =\n{khipu_rep}")
# of Khipus with No Known Cord Attachment is 137 (21%)
Zero Cord Attachment Khipus =
['AS011', 'AS012', 'AS014', 'AS015', 'AS016', 'AS017', 'AS018', 'AS019',
 'AS020', 'AS023', 'AS024', 'AS025', 'AS026A', 'AS026B', 'AS027', 'AS028',
 'AS029', 'AS035C', 'AS035D', 'AS036', 'AS037', 'AS039', 'AS041', 'AS042',
 'AS043', 'AS045', 'AS048', 'AS050', 'AS054', 'AS055', 'AS059', 'AS060', 'AS062',
 'AS063', 'AS063B', 'AS064', 'AS065', 'AS065B', 'AS069', 'AS071', 'AS073',
 'AS077', 'AS081', 'AS082', 'AS083', 'AS085', 'AS089', 'AS090/N2', 'AS092',
 'AS093', 'AS094', 'AS101 - Part 1', 'AS101 - Part 2', 'AS110', 'AS111', 'AS112',
 'AS122', 'AS125', 'AS128', 'AS129', 'AS132', 'AS133', 'AS134', 'AS137', 'AS139',
 'AS142', 'AS153', 'AS155', 'AS156', 'AS157', 'AS158', 'AS159', 'AS160', 'AS164',
 'AS168', 'AS169', 'AS170', 'AS171', 'AS172', 'AS173', 'AS174', 'AS177', 'AS178',
 'AS182', 'AS182B', 'AS183', 'AS184', 'AS185', 'AS186', 'AS187', 'AS188',
 'AS189', 'AS201', 'AS202', 'AS203', 'AS204', 'AS205', 'AS206', 'AS207B',
 'AS207C', 'AS209', 'AS210', 'AS211', 'AS213', 'AS214', 'AS215F', 'AS35A',
 'AS35B', 'UR040', 'UR041', 'UR042', 'UR051', 'UR084', 'UR1033A', 'UR1034',
 'UR1040', 'UR1097', 'UR1098', 'UR1099', 'UR1100', 'UR1102', 'UR1103', 'UR1105',
 'UR1106', 'UR1107', 'UR1108', 'UR1109', 'UR1113', 'UR1114', 'UR1116', 'UR1117',
 'UR1118', 'UR1119', 'UR1120', 'UR1121', 'UR1123', 'UR1124', 'UR1124 Detail 1',
 'UR1126', 'UR1127', 'UR1130', 'UR1131', 'UR1135', 'UR1136', 'UR1138', 'UR1140',
 'UR1141', 'UR1143', 'UR1144', 'UR1145', 'UR1146', 'UR1147', 'UR1148', 'UR1149',
 'UR1150', 'UR1151', 'UR1152', 'UR1154', 'UR1161', 'UR1162A', 'UR1162B',
 'UR1163', 'UR1165', 'UR1166', 'UR1167', 'UR1175', 'UR1176', 'UR1179', 'UR1180',
 'UR127', 'UR129', 'UR132', 'UR215', 'AS005', 'AS008', 'AS009', 'CM009',
 'KH0058', 'UR050', 'UR052', 'UR054', 'UR055', 'UR110', 'UR112', 'UR144']

10.8 Number of Knots

num_knots = sum([aKhipu.num_knots() for aKhipu in all_khipus])
print(f"\nNumber of knots (total) for all Khipus = {num_knots}")

Number of knots (total) for all Khipus = 119165

10.9 Khipus with No Knots

How many Khipus have no knots?

## Zero Knot Khipus
def satisfaction_condition(aCord):
    return aCord.knotted_value() == 0

zero_knot_khipus = []
for aKhipu in all_khipus:
    if all([satisfaction_condition(aCord) for aCord in aKhipu[:,:]]):
        zero_knot_khipus.append(aKhipu.name())
print(f"# of Khipus with No Knots is {ku.pct_kfg_khipus(len(zero_knot_khipus))}")

khipu_rep = ku.multiline(zero_knot_khipus, line_length=80, continuation_char="\n ")
print(f"Zero Knot Khipus =\n{khipu_rep}")
# of Khipus with No Knots is 14 (2%)
Zero Knot Khipus =
['AS025', 'HP025', 'HP026', 'HP028', 'UR070', 'UR071', 'UR082', 'UR103',
 'UR158', 'UR179', 'UR185', 'UR216', 'HP048', 'QU001']

10.10 Khipus with No Knot Twists

How many khipus have no recorded knot twists?

khipus_by_knot_twist = {}
for aKhipu in all_khipus:
    if num_khipu_knots := aKhipu.num_knots():
        khipus_by_knot_twist[aKhipu.name()] = (aKhipu.num_s_knots() + aKhipu.num_z_knots())/num_khipu_knots
    else:
        khipus_by_knot_twist[aKhipu.name()] = 0
khipus_by_knot_twist = dict(sorted(khipus_by_knot_twist.items(), key=lambda x:x[1]))
zero_knot_twist_khipus = [key for key in khipus_by_knot_twist.keys() if khipus_by_knot_twist[key]==0 ]
num_zero_knot_twist_khipus = len(zero_knot_twist_khipus)
print(f"# of Khipus with No Known Knot Twist is {ku.pct_kfg_khipus(num_zero_knot_twist_khipus)}")

khipu_rep = ku.multiline(zero_knot_twist_khipus, line_length=80, continuation_char="\n ")
print(f"No Known Knot Twist Khipus =\n{khipu_rep}")
# of Khipus with No Known Knot Twist is 140 (22%)
No Known Knot Twist Khipus =
['AS010', 'AS011', 'AS012', 'AS013', 'AS014', 'AS015', 'AS016', 'AS017',
 'AS018', 'AS019', 'AS020', 'AS021', 'AS023', 'AS024', 'AS025', 'AS026A',
 'AS026B', 'AS027', 'AS028', 'AS029', 'AS035C', 'AS035D', 'AS036', 'AS037',
 'AS039', 'AS041', 'AS042', 'AS043', 'AS044', 'AS045', 'AS048', 'AS050', 'AS054',
 'AS055', 'AS059', 'AS060', 'AS061/MA036', 'AS062', 'AS063', 'AS063B', 'AS064',
 'AS065', 'AS065B', 'AS066', 'AS069', 'AS071', 'AS073', 'AS077', 'AS081',
 'AS082', 'AS083', 'AS085', 'AS089', 'AS090/N2', 'AS092', 'AS093', 'AS094',
 'AS101 - Part 1', 'AS101 - Part 2', 'AS110', 'AS111', 'AS112', 'AS115', 'AS122',
 'AS125', 'AS128', 'AS129', 'AS132', 'AS133', 'AS134', 'AS137', 'AS139', 'AS142',
 'AS153', 'AS155', 'AS156', 'AS157', 'AS158', 'AS159', 'AS160', 'AS164', 'AS168',
 'AS169', 'AS170', 'AS171', 'AS172', 'AS173', 'AS174', 'AS177', 'AS178', 'AS182',
 'AS182B', 'AS183', 'AS184', 'AS185', 'AS186', 'AS187', 'AS188', 'AS189',
 'AS201', 'AS202', 'AS203', 'AS204', 'AS205', 'AS206', 'AS207A', 'AS207B',
 'AS207C', 'AS209', 'AS210', 'AS211', 'AS212', 'AS213', 'AS214', 'AS215',
 'AS215F', 'AS35A', 'AS35B', 'HP025', 'HP026', 'HP028', 'UR070', 'UR071',
 'UR082', 'UR158', 'AS001', 'AS002', 'AS003', 'AS004', 'AS005', 'AS006', 'AS007',
 'AS008', 'AS009', 'AS072', 'CM009', 'HP048', 'KH0058', 'KH0080', 'QU001']

10.11 Khipus with Long Knots with Known Axis Orientation

How many Khipus record axis orientation of long knots?

import khipu_cord

CSV_dir = f"{kq.project_directory()}/data/CSV"
axis_df = pd.read_csv(f"{CSV_dir}/knot_clean.csv")
def has_old_orientation(x): return isinstance(x, str) and x.startswith("AX")
long_knot_direction_mask = [has_old_orientation(orientation) for orientation in axis_df.axis_orientation.values ]
cord_ids = list(axis_df[long_knot_direction_mask].cord_id.values)
KFG_khipus = list(set([khipu_cord.fetch_cord(cord_id).khipu_name for cord_id in cord_ids]))

khipus_with_axis_orientation = sorted(list(set(KFG_khipus)))
num_zero_axis_orientation_khipus = len(all_khipus) - len(khipus_with_axis_orientation)
print(f"# of Khipus with No Known Long Knot Axis_orientation is {ku.pct_kfg_khipus(num_zero_axis_orientation_khipus)}")

khipu_rep = ku.multiline(khipus_with_axis_orientation, line_length=80, continuation_char="\n ")
print(f"KNOWN Axis-Orientation Khipus =\n{khipu_rep}")
# of Khipus with No Known Long Knot Axis_orientation is 424 (65%)
KNOWN Axis-Orientation Khipus =
['AS067/MA029', 'AS074', 'AS075', 'AS076', 'AS078', 'AS079', 'AS080', 'AS093',
 'AS191', 'AS192', 'AS193', 'AS194', 'AS195', 'AS196', 'AS197', 'AS198', 'AS199',
 'AS200', 'HP001', 'HP002', 'HP003', 'HP004', 'HP005', 'HP006', 'HP007', 'HP008',
 'HP010', 'HP011', 'HP012', 'HP013', 'HP014', 'HP015', 'HP016', 'HP017', 'HP018',
 'HP019', 'HP020', 'HP021', 'HP022', 'HP023', 'HP024', 'HP027', 'HP029', 'HP030',
 'HP031', 'HP032', 'LL01', 'MM001', 'MM002', 'MM003', 'MM004', 'MM005',
 'MM006/AN001', 'MM007/AN002', 'MM008', 'MM009', 'MM010', 'MM011', 'MM012',
 'MM013', 'MM014', 'MM015', 'MM016', 'MM017', 'MM018', 'MM019', 'MM020', 'MM021',
 'MM1086', 'UR002', 'UR003', 'UR005', 'UR006', 'UR007', 'UR008', 'UR009',
 'UR010', 'UR011', 'UR012', 'UR013', 'UR014', 'UR015', 'UR016', 'UR018', 'UR019',
 'UR020', 'UR021', 'UR022', 'UR034', 'UR051', 'UR060', 'UR061', 'UR062', 'UR063',
 'UR064', 'UR066', 'UR067', 'UR069', 'UR072', 'UR074', 'UR075', 'UR076', 'UR077',
 'UR078', 'UR079', 'UR080', 'UR081', 'UR085', 'UR086', 'UR087', 'UR090', 'UR091',
 'UR092', 'UR093', 'UR094', 'UR095', 'UR096', 'UR097', 'UR098', 'UR099', 'UR100',
 'UR101', 'UR102', 'UR1031', 'UR104', 'UR105', 'UR1051', 'UR1052', 'UR1053',
 'UR1057', 'UR1058', 'UR106', 'UR107', 'UR108', 'UR1084', 'UR109', 'UR1104',
 'UR111', 'UR120', 'UR128', 'UR129', 'UR131A', 'UR131B', 'UR132', 'UR143',
 'UR152', 'UR168', 'UR169', 'UR170', 'UR171', 'UR172', 'UR173', 'UR174', 'UR175',
 'UR176', 'UR177', 'UR182', 'UR183', 'UR184', 'UR185', 'UR186', 'UR187', 'UR188',
 'UR189', 'UR191', 'UR192', 'UR194', 'UR195', 'UR197', 'UR198', 'UR199', 'UR200',
 'UR201', 'UR202', 'UR203', 'UR204', 'UR205', 'UR207', 'UR208', 'UR210', 'UR212',
 'UR213', 'UR214', 'UR215', 'UR216', 'UR217', 'UR218', 'UR220', 'UR221', 'UR222',
 'UR223', 'UR225', 'UR226', 'UR228', 'UR229', 'UR230', 'UR231', 'UR232', 'UR233',
 'UR234', 'UR235', 'UR236', 'UR237', 'UR238', 'UR239', 'UR240', 'UR241', 'UR242',
 'UR243', 'UR244', 'UR245', 'UR246', 'UR247', 'UR248', 'UR249', 'UR250', 'UR256',
 'UR264', 'UR265', 'UR274B', 'UR282', 'UR283', 'UR285', 'UR286', 'UR287',
 'UR289', 'UR290', 'UR291A', 'UR294']
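The filter above reduces to a boolean mask over the `axis_orientation` column: keep rows whose value is a string beginning with "AX", treating anything else (including missing values) as unrecorded. The same logic, sketched library-free on a few invented rows:

```python
# Synthetic stand-in for rows of knot_clean.csv (values invented)
rows = [
    {"cord_id": 101, "axis_orientation": "AX_S"},
    {"cord_id": 102, "axis_orientation": None},      # unrecorded
    {"cord_id": 103, "axis_orientation": "AX_Z"},
    {"cord_id": 104, "axis_orientation": "unknown"},
]

def has_axis_orientation(x):
    # Non-strings (e.g. missing values) count as unrecorded
    return isinstance(x, str) and x.startswith("AX")

cord_ids = [r["cord_id"] for r in rows if has_axis_orientation(r["axis_orientation"])]
print(cord_ids)  # [101, 103]
```

In the real code the surviving `cord_id`s are then mapped back to khipu names and de-duplicated with `set`.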

11. Nudo Desnudo

So how do you represent an unknotted knot? The authors of the Harvard database decided to create knot clusters containing zero knots. Did this break the computer code until it was fixed? Absolutely! :-) Zen knots.
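Since knot clusters along a pendant cord encode decimal digits positionally, one defensive reading is to let an empty cluster stand for a zero in its digit position instead of assuming every cluster holds at least one knot. A sketch, with an invented cluster representation (each cluster is simply a list of knots):

```python
def cord_value(clusters):
    """Read knot clusters as decimal digits, most significant first.
    An empty cluster (a 'nudo desnudo') reads as zero in its position."""
    value = 0
    for cluster in clusters:
        value = value * 10 + len(cluster)  # empty cluster contributes 0
    return value

# Three positions: 2 knots, an empty cluster, 4 knots -> 204
print(cord_value([["S", "S"], [], ["S", "S", "S", "S"]]))  # 204
```

Code that instead indexes into `cluster[0]` unconditionally is exactly what the zero-knot clusters broke.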