$ bsub -Is -R "span[hosts=1]" -gpu "num=8:j_exclusive=yes" ilab model chat --model /u/gsamu/.local/share/instructlab/checkpoints/hf_format/samples_4435
Job <1146> is submitted to default queue <interactive>.
<<Waiting for dispatch ...>>
<<Starting on p1-r01-n2>>
INFO 2025-01-02 15:06:07,600 instructlab.model.backends.vllm:105: Trying to connect to model server at http://127.0.0.1:8000/v1
INFO 2025-01-02 15:06:08,876 instructlab.model.backends.vllm:308: vLLM starting up on pid 3744375 at http://127.0.0.1:41531/v1
INFO 2025-01-02 15:06:08,876 instructlab.model.backends.vllm:114: Starting a temporary vLLM server at http://127.0.0.1:41531/v1
INFO 2025-01-02 15:06:08,876 instructlab.model.backends.vllm:129: Waiting for the vLLM server to start at http://127.0.0.1:41531/v1, this might take a moment... Attempt: 1/120
INFO 2025-01-02 15:06:12,244 instructlab.model.backends.vllm:129: Waiting for the vLLM server to start at http://127.0.0.1:41531/v1, this might take a moment... Attempt: 2/120
...
INFO 2025-01-02 15:07:45,582 instructlab.model.backends.vllm:129: Waiting for the vLLM server to start at http://127.0.0.1:41531/v1, this might take a moment... Attempt: 30/120
INFO 2025-01-02 15:07:45,586 instructlab.model.backends.vllm:136: vLLM engine successfully started at http://127.0.0.1:41531/v1
╭────────────────────────────────────────────────────────────── system ──────────────────────────────────────────────────────────────╮
│ Welcome to InstructLab Chat w/ SAMPLES_4435 (type /h for help) │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
>>> Tell me everything you know about Swifties. [S][default]
╭─────────────────────────────────────────────────────────── samples_4435 ───────────────────────────────────────────────────────────╮
│ Swifties are the fandom of the American singer-songwriter Taylor Swift. │
│ Regarded by journalists as one of the largest, most devoted, and influential fan bases, Swifties are known for their high levels │
│ of participation, creativity, community, fanaticism, and cultural impact on the music industry and popular culture. They are a │
│ subject of widespread coverage in the mainstream media. │
│ │
│ Critics have opined that Swift has redefined artist-fan relationships by establishing an intimate connection with Swifties. She │
│ has frequently engaged with, helped, credited, and prioritized her fans, who have offered unprecedented support and interest in │
│ her works irrespective of her wavering reception in the media. They continued to support Swift through her genre transitions, │
│ unexpected artistic pivots, and her highly publicized controversies such as the 2019 masters dispute, while instigating the │
│ political scrutiny of Ticketmaster that led to implementation of various laws and stimulated economic growth with the Eras Tour. │
│ Swift's releases, promotional efforts, and fashion have garnered attention for incorporating Easter eggs and clues that are
...
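# Sketch (not part of the captured session): while `ilab model chat` keeps the temporary
# vLLM server running, that server exposes a standard OpenAI-compatible API on the node,
# so the same checkpoint could be queried directly from another shell on p1-r01-n2.
# Assumptions: the port (41531 here) is chosen at startup and changes on every run, the
# served model name defaults to the path passed to vLLM, and no API key is required for
# local requests.
$ curl http://127.0.0.1:41531/v1/models
$ curl http://127.0.0.1:41531/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
          "model": "/u/gsamu/.local/share/instructlab/checkpoints/hf_format/samples_4435",
          "messages": [{"role": "user", "content": "Tell me everything you know about Swifties."}]
        }'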