FlexGen, an engine for running AI bots on a single GPU

FlexGen

FlexGen is an engine built with the goal of reducing the resource requirements of large language model inference down to a single GPU.

News recently broke that a team of researchers from Stanford University, the University of California at Berkeley, ETH Zurich, the Graduate School of Economics, Carnegie Mellon University, as well as Yandex and Meta, has published the source code of an engine for running large language models on systems with limited resources.

Code-named «FlexGen», the project aims to significantly reduce the resource requirements of LLM inference tasks. Published on GitHub, FlexGen only requires Python and PyTorch, and, most notably, it can be used with a single GPU such as an NVIDIA Tesla T4 or a GeForce RTX 3090.

For example, the engine makes it possible to build functionality reminiscent of ChatGPT and Copilot by running the pre-trained OPT-175B model, which covers 175 billion parameters, on an ordinary computer with an NVIDIA RTX 3090 gaming graphics card equipped with 24 GB of video memory.

It is worth noting that large language models (LLMs) are what power tools such as ChatGPT and Copilot. These are large models that use billions of parameters and are trained on huge amounts of data.

The high compute and memory requirements of LLM inference tasks generally demand the use of high-end accelerators.

As the developers themselves put it: "We are glad that the community is really excited about FlexGen. However, our work is still being polished and is not yet ready for a public release/announcement. From early feedback on this work, we realized that earlier versions of this README and of our documentation were unclear about the purpose of FlexGen. This is a first attempt at lowering the resource requirements of LLMs, but it also has many limitations and is not intended to cover use cases where sufficient resources are available."

LLM inference is the process in which a language model is used to make predictions about input text: it involves using a language model, such as a generative model like GPT (Generative Pre-trained Transformer), to predict the text most likely to be produced as a response to a given prompt.
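To make the idea concrete, here is a minimal sketch of that inference loop. It uses the Hugging Face transformers library (which is not part of FlexGen) and a small OPT checkpoint purely for illustration; the model name, prompt, and generation settings are assumptions chosen for the example.

```python
# Minimal sketch of LLM inference: the model continues a prompt token by token.
# Uses the Hugging Face "transformers" library (not FlexGen) with a small OPT
# checkpoint chosen only for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-1.3b")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")

prompt = "Question: What is FlexGen?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding: at each step the model scores all tokens and the most
# likely one is appended to the sequence.
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```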

About FlexGen

The package includes a sample script for creating bots, which lets the user download one of the publicly available language models and start chatting with it right away.
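As an illustration only (this is not the actual script shipped with FlexGen), such a chat loop boils down to appending each user message to a running transcript and letting the model continue it. The function name chat_loop and the generate_reply callback below are hypothetical stand-ins for whatever inference backend is loaded underneath.

```python
def chat_loop(generate_reply):
    """Keep a running transcript and let the model continue it after each user turn."""
    history = ""
    while True:
        user = input("You: ")
        if user.strip().lower() in {"quit", "exit"}:
            break
        history += f"Human: {user}\nAssistant:"
        reply = generate_reply(history)       # backend completes the transcript
        history += f" {reply}\n"
        print("Assistant:", reply)

if __name__ == "__main__":
    # Dummy echo backend so the sketch runs on its own; a real setup would plug
    # in FlexGen (or any other LLM inference engine) here instead.
    chat_loop(lambda transcript: "(reply to: " + transcript.splitlines()[-2] + ")")
```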

As a baseline, it is proposed to use the large language model published by Facebook, trained on the BookCorpus (10,000 books), CC-Stories, The Pile (OpenSubtitles, Wikipedia, DM Mathematics, HackerNews, etc.), Pushshift.io (based on Reddit data) and CCNewsV2 (news archive) collections.

The model covers around 180 billion tokens (800 GB of data). Training it took 33 days on a cluster of 992 NVIDIA A100 80 GB GPUs.

Running OPT-175B on a system with a single NVIDIA T4 GPU (16 GB), the FlexGen engine demonstrated performance up to 100 times faster than previously offered solutions, making the use of large language models considerably more affordable and allowing them to run on systems without dedicated accelerators.
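The trick that makes this possible is offloading: weights that do not fit in GPU memory are kept in CPU RAM (or on disk) and streamed onto the GPU only while they are needed. The PyTorch sketch below illustrates the idea with stand-in linear layers; it is a conceptual toy, not FlexGen's scheduler, which additionally offloads to disk and overlaps transfers with compute.

```python
import torch
import torch.nn as nn

# Stand-in for a stack of transformer blocks; real LLM layers are much larger.
layers = [nn.Linear(4096, 4096) for _ in range(8)]
device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.randn(1, 4096)
with torch.no_grad():
    for layer in layers:
        layer.to(device)            # stream this layer's weights onto the GPU
        x = layer(x.to(device))     # run the layer
        layer.to("cpu")             # evict the weights so GPU memory stays flat
        x = x.cpu()                 # (FlexGen also offloads activations/KV cache)
print(x.shape)
```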

At the same time, FlexGen can scale to parallelize computation when multiple GPUs are available. To reduce the model size, an additional parameter compression scheme and a model caching mechanism are used.
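As a rough illustration of the compression side, group-wise quantization stores each group of weights with a per-group scale and offset so that the values fit into a few bits. The sketch below (with hypothetical helper names quantize_groupwise / dequantize_groupwise) is a toy version of that idea, not FlexGen's actual implementation.

```python
import torch

def quantize_groupwise(w: torch.Tensor, group_size: int = 64, bits: int = 4):
    """Encode a 1-D tensor with `bits` per value, using one scale/offset per group."""
    groups = w.reshape(-1, group_size)
    g_min = groups.min(dim=1, keepdim=True).values
    g_max = groups.max(dim=1, keepdim=True).values
    scale = (g_max - g_min).clamp(min=1e-8) / (2 ** bits - 1)
    codes = torch.round((groups - g_min) / scale).to(torch.uint8)  # values 0..15
    return codes, scale, g_min

def dequantize_groupwise(codes, scale, g_min):
    return (codes.float() * scale + g_min).reshape(-1)

w = torch.randn(4096)
codes, scale, g_min = quantize_groupwise(w)
w_hat = dequantize_groupwise(codes, scale, g_min)
print("max reconstruction error:", (w - w_hat).abs().max().item())
```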

Currently, FlexGen only supports OPT language models, but in the future the developers also promise to add support for BLOOM (176 billion parameters, supporting 46 languages and 13 programming languages), CodeGen (which can generate code in 22 programming languages), and GLM.

Finally, it is worth mentioning that the code is written in Python, uses the PyTorch framework and is distributed under the Apache 2.0 license.

If you are interested in learning more about it, you can check the details at the following link.

