Tabulate outputs for JSON outputs #50


Merged · orangetin merged 2 commits into main from orangetin/tabulate on Oct 17, 2023

Conversation

orangetin (Member) commented:

Outputs results in human-readable form instead of a raw JSON dump. The original format can still be printed by passing --raw.

For example, the output of finetune list-events used to look like:

{
    "data": [
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T21:42:09.864Z",
            "level": "",
            "message": "Fine tune request created",
            "type": "JOB_PENDING",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": ""
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:06:01.887Z",
            "level": "info",
            "message": "Training started at Tue Oct  3 15:06:01 PDT 2023",
            "type": "JOB_START",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": "-5404845294489111409"
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:08:05.251Z",
            "level": "info",
            "message": "Model data downloaded for togethercomputer/llama-2-7b at Tue Oct  3 15:08:04 PDT 2023",
            "type": "MODEL_DOWNLOAD_COMPLETE",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": "5813098827847397758"
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:08:07.352Z",
            "level": "info",
            "message": "Training data downloaded for togethercomputer/llama-2-7b at Tue Oct  3 15:08:06 PDT 2023",
            "type": "TRAINING_DATA_DOWNLOAD_COMPLETE",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": "-2082647884415193343"
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:08:36.339Z",
            "level": "info",
            "message": "Training started for model /work/job-ft-d31970b7-b94a-432c-990e-6ddd54cc1f08/model",
            "type": "TRAINING_START",
            "param_count": 6738415616,
            "token_count": 35568,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": "3575589923974681476"
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:09:08.319Z",
            "level": "info",
            "message": "Epoch completed, at step 3",
            "type": "EPOCH_COMPLETE",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": "3036370196950450130"
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:09:30.613Z",
            "level": "info",
            "message": "Epoch completed, at step 6",
            "type": "EPOCH_COMPLETE",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": "8231958398042126836"
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:09:53.179Z",
            "level": "info",
            "message": "Epoch completed, at step 9",
            "type": "EPOCH_COMPLETE",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": "5673844678110964808"
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:10:01.059Z",
            "level": "info",
            "message": "Epoch completed, at step 10",
            "type": "EPOCH_COMPLETE",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": "-909161687365331404"
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:10:21.614Z",
            "level": "info",
            "message": "Training completed for togethercomputer/llama-2-7b at Tue Oct  3 15:10:21 PDT 2023",
            "type": "TRAINING_COMPLETE",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "",
            "training_offset": 0,
            "hash": "-6701380572541858137"
        },
        {
            "object": "fine-tune-event",
            "created_at": "2023-10-03T22:15:22.89Z",
            "level": "info",
            "message": "Job finished at Tue Oct  3 15:15:22 PDT 2023",
            "type": "JOB_COMPLETE",
            "param_count": 0,
            "token_count": 0,
            "checkpoint_path": "",
            "model_path": "s3://together-dev/finetune/646ce94a82981bf51adde909/orangetin/llama-2-7b-2023-10-03-21-42-09/ft-d31970b7-b94a-432c-990e-6ddd54cc1f08-2023-10-03-15-11-09",
            "training_offset": 0,
            "hash": "-8040418296897197605"
        }
    ],
    "object": "list"
}

but now looks like:

+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|    | Message                                                                                  | Type                            | Hash                 |
+====+==========================================================================================+=================================+======================+
|  0 | Fine tune request created                                                                | JOB_PENDING                     |                      |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|  1 | Training started at Tue Oct  3 15:06:01 PDT 2023                                         | JOB_START                       | -5404845294489111409 |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|  2 | Model data downloaded for togethercomputer/llama-2-7b at Tue Oct  3 15:08:04 PDT 2023    | MODEL_DOWNLOAD_COMPLETE         | 5813098827847397758  |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|  3 | Training data downloaded for togethercomputer/llama-2-7b at Tue Oct  3 15:08:06 PDT 2023 | TRAINING_DATA_DOWNLOAD_COMPLETE | -2082647884415193343 |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|  4 | Training started for model /work/job-ft-d31970b7-b94a-432c-990e-6ddd54cc1f08/model       | TRAINING_START                  | 3575589923974681476  |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|  5 | Epoch completed, at step 3                                                               | EPOCH_COMPLETE                  | 3036370196950450130  |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|  6 | Epoch completed, at step 6                                                               | EPOCH_COMPLETE                  | 8231958398042126836  |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|  7 | Epoch completed, at step 9                                                               | EPOCH_COMPLETE                  | 5673844678110964808  |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|  8 | Epoch completed, at step 10                                                              | EPOCH_COMPLETE                  | -909161687365331404  |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
|  9 | Training completed for togethercomputer/llama-2-7b at Tue Oct  3 15:10:21 PDT 2023       | TRAINING_COMPLETE               | -6701380572541858137 |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
| 10 | Job finished at Tue Oct  3 15:15:22 PDT 2023                                             | JOB_COMPLETE                    | -8040418296897197605 |
+----+------------------------------------------------------------------------------------------+---------------------------------+----------------------+
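
For reference, here is a minimal sketch of how such a table could be produced with Python's tabulate package (which the orangetin/tabulate branch name suggests is used). The print_events helper, its signature, and the exact tabulate options are assumptions for illustration, not the PR's actual code:

import json

from tabulate import tabulate


def print_events(response: dict, raw: bool = False) -> None:
    # Hypothetical helper, not the PR's actual implementation.
    # With --raw, dump the original JSON response unchanged.
    if raw:
        print(json.dumps(response, indent=4))
        return

    # Otherwise keep only the fields shown in the example above
    # (message, type, hash) and render them as a grid table with
    # a numeric row index, similar to the output in this PR.
    rows = [[e["message"], e["type"], e["hash"]] for e in response["data"]]
    print(tabulate(rows, headers=["Message", "Type", "Hash"],
                   tablefmt="grid", showindex=True))

With the JSON shown above loaded into response, print_events(response) would render a grid table like the one in this PR, while print_events(response, raw=True) would fall back to the raw dump, mirroring the --raw flag.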

@orangetin orangetin requested review from clam004 and azahed98 October 17, 2023 00:07
@azahed98 (Contributor) commented:

Awesome change! LGTM

@orangetin orangetin merged commit bb33337 into main Oct 17, 2023
@orangetin orangetin deleted the orangetin/tabulate branch October 17, 2023 15:37