+\hSB4dZddlmZddlmZddlmZmZmZm Z m Z m Z m Z m Z mZddlmZmZddlmZddlmZddlmZdd lmZmZmZdd lmZmZmZm Z m!Z!dd l"m#Z#dd l$m%Z%m&Z&m'Z'erdd l(m)Z)ddl*m+Z+ddl,m-Z-dZ.Gdde e'Z/Gdde/e'Z0y)z4CommandCursor class to iterate over command results.) annotations)deque) TYPE_CHECKINGAny AsyncIteratorGenericMappingNoReturnOptionalSequenceUnion) CodecOptions&_convert_raw_document_lists_to_streams)_csot)_ConnectionManager)_CURSOR_CLOSED_ERRORS)ConnectionFailureInvalidOperationOperationFailure)_CursorAddress_GetMore_OpMsg_OpReply_RawBatchGetMore)PinnedResponse)_Address _DocumentOut _DocumentType)AsyncClientSession)AsyncCollection)AsyncConnectionFceZdZdZeZ d ddZd dZd!dZd"dZ e d#dZ d$dZ d% d&d Z e d"d Ze d'd Ze d(d Ze d)d Zd*dZd dZd dZd dZd dZd+dZd'dZd,dZd-dZd-dZd.dZd/d0dZd1dZd2dZd3dZ e!jDd/d4dZ#y)5AsyncCommandCursorz7An asynchronous cursor / iterator over command cursors.Nc bd|_||_|d|_t|d|_|j d|_||_||_||_ |jjjjj|_||_||_|jdk(|_||_|j"r|j'd|vr |d|_n|j*|_|j-|t/|t0s|t3dt5|yy)zCreate a new command cursor.Nid firstBatchpostBatchResumeTokenrnsz2max_await_time_ms must be an integer or None, not ) _sock_mgr _collection_idr_dataget_postbatchresumetoken_address _batch_size_max_await_time_msdatabaseclientoptionstimeout_timeout_session_explicit_session_killed_comment _end_session_ns full_name batch_size isinstanceint TypeErrortype) self collection cursor_infoaddressr>max_await_time_mssessionexplicit_sessioncomments f/root/niggaflix-v3/playground/venv/lib/python3.12/site-packages/pymongo/asynchronous/command_cursor.py__init__zAsyncCommandCursor.__init__;s #;Et$;|45 BM// "C "  %"3((1188@@HH  !1xx1}  <<     ; "4(DH!++DH  #+S16G6SDTJ[E\D]^ 7T1c$|jyN) _die_no_lockrCs rK__del__zAsyncCommandCursor.__del__es rMct|tstdt||dkr t d|dk(xrdxs||_|S)aLimits the number of documents returned in one batch. Each batch requires a round trip to the server. It can be adjusted to optimize performance and limit data transfer. .. note:: batch_size can not override MongoDB's internal limits on the amount of data it will return to the client in a single batch (i.e if you set batch size to 1,000,000,000, MongoDB will currently only return 4-16MB of results per batch). Raises :exc:`TypeError` if `batch_size` is not an integer. Raises :exc:`ValueError` if `batch_size` is less than ``0``. :param batch_size: The size of each batch of results requested. z#batch_size must be an integer, not rzbatch_size must be >= 0)r?r@rArB ValueErrorr0)rCr>s rKr>zAsyncCommandCursor.batch_sizehsU*c*A$zBRASTU U >67 7%?0q>J rMc2t|jdkDS)z^Returns `True` if the cursor has documents remaining from the previous batch. r)lenr,rQs rK _has_nextzAsyncCommandCursor._has_nexts4::""rMc|jS)zlRetrieve the postBatchResumeToken from the response to a changeStream aggregate or getMore. )r.rQs rK_post_batch_resume_tokenz+AsyncCommandCursor._post_batch_resume_tokens )))rMc8K|jjj}|j|jsy|j sL|j t|d}|jdk(r|jd{y||_yy7w)NFr) r*r2r3_should_pin_cursorr7r) pin_cursorrr+close)rCconnr3conn_mgrs rK_maybe_pin_connectionz(AsyncCommandCursor._maybe_pin_connections}!!**11((7 ~~ OO )$6Hxx1}nn&&&!) 'sBB B Bc*|j||||SrO)unpack_response)rCresponse cursor_id codec_options user_fieldslegacy_responses rK_unpack_responsez#AsyncCommandCursor._unpack_responses'' =+__rMc\tt|jxs |j S)a Does this cursor have the potential to return more data? Even if :attr:`alive` is ``True``, :meth:`next` can raise :exc:`StopIteration`. Best to use a for loop:: async for doc in collection.aggregate(pipeline): print(doc) .. note:: :attr:`alive` can be True while iterating a cursor from a failed server. In this case :attr:`alive` will return False after :meth:`next` fails to retrieve the next batch of results from the server. )boolrXr,r9rQs rKalivezAsyncCommandCursor.alives#C O9DLL(8::rMc|jS)zReturns the id of the cursor.)r+rQs rKrfzAsyncCommandCursor.cursor_ids xxrMc|jS)zUThe (host, port) of the server used, or None. .. versionadded:: 3.0 )r/rQs rKrFzAsyncCommandCursor.addresss }}rMc4|jr |jSy)zThe cursor's :class:`~pymongo.asynchronous.client_session.AsyncClientSession`, or None. .. versionadded:: 3.6 N)r8r7rQs rKrHzAsyncCommandCursor.sessions  ! !== rMc|j}d|_|jr@|s>|j}|jJt|j|j}||fSd}d}||fS)NTr)r9r+r/rr<)rCalready_killedrfrFs rK_prepare_to_diez"AsyncCommandCursor._prepare_to_diesl 88NI==, ,,$T]]DHH=G '!!IG'!!rMc|j\}}|jjjj |||j |j |j|jsd|_d|_y)z,Closes this cursor without acquiring a lock.N)rsr*r2r3_cleanup_cursor_no_lockr)r7r8rCrfrFs rKrPzAsyncCommandCursor._die_no_locksg!113 7 !!((@@ w t?U?U %% DMrMcK|j\}}|jjjj |||j |j |jd{|jsd|_d|_y7w)zCloses this cursor.N)rsr*r2r3_cleanup_cursor_lockr)r7r8rvs rK _die_lockzAsyncCommandCursor._die_locks~!113 7''..CC   NN MM  " "    %% DM sA(B *B + B cz|jr/|js"|jjd|_yyyrO)r7r8_end_implicit_sessionrQs rKr;zAsyncCommandCursor._end_sessions/ ==!7!7 MM / / 1 DM"8=rMc@K|jd{y7w)z$Explicitly close / kill this cursor.N)ryrQs rKr_zAsyncCommandCursor.closesnn cK|jjj} |j||j|j d{}t|t r1|j"s%t%|j&|j(|_|j*r8|j,dd}|d}|j/d|_|d|_nC|j,}t|j4t6sJ|j4j8|_|j2dk(r|jd{t;||_y7#t $rU}|jtvrd|_ |jr|j|jd{7d}~wt$r"d|_ |jd{7t$r|jd{7wxYw7ĭw) z/Send a getmore message and handle the response.)rFNTrcursor nextBatchr'r%)r*r2r3_run_operationrjr/rcoderr9r5rPr_r Exceptionr?rr)rr` more_to_come from_commanddocsr-r.r+datarrfrr,)rC operationr3reexcr documentss rK _send_messagez AsyncCommandCursor._send_messages!!**11 #22400$--3H. h />>!3HMM8CXCX!Y  ]]1%h/F{+I)/4J)KD &d|DH IhmmX6 66}}..DH 88q=**,  9% M xx00# {{!!# jjl""   DL**,    **,    $ sl!H+EEEC*H>H?HE H A F0)F,*F00&HG!H8G;9HHcKt|js |jrt|jS|jr|jj dd\}}|j j|j}|j|j|||j|j|j j||j|j jj|j |j"d|j$ d{n|j'd{t|jS727w)aRefreshes the cursor with more data from the server. Returns the length of self._data after refresh. Will exit early if self._data is already non-empty. Raises OperationFailure when the cursor cannot be refreshed due to an error on the query. .rTFN)rXr,r9r+r<splitr*_read_preference_forrHr_getmore_classr0rgr7r2r3r1r)r:ry)rCdbnamecollname read_prefs rK_refreshzAsyncCommandCursor._refresh&s tzz?dlltzz? " 88#xx~~c15 FH((==dllKI$$##$$HH$$22MM$$--44++NNMM   ".." " "4::' " #s$D#E%E&E>E?EEc|SrOrQs rK __aiter__zAsyncCommandCursor.__aiter__Hs rMcK|jr*|jdd{}||S|jr*t7w)zAdvance the cursor.TN)rm _try_nextStopAsyncIteration)rCdocs rKnextzAsyncCommandCursor.nextKs@jjt,,C jj ! -s!A?AAc>K|jd{S7wrO)rrQs rK __anext__zAsyncCommandCursor.__anext__UsYY[   s cKt|js&|js|r|jd{t|jr|jj Sy74w)z>> await cursor.to_list() Or, so read at most n items from the cursor:: >>> await cursor.to_list(n) If the cursor is empty or has no more results, an empty list will be returned. .. versionadded:: 4.9 rTz'to_list() length must be greater than 0Nr)r?r@rVrmrrX)rClengthres remainings rKto_listzAsyncCommandCursor.to_lists $& fc "vzFG Gjj))#y999  !"SX- > jj :sAA9A7 +A95A9rNNFNrDzAsyncCollection[_DocumentType]rEzMapping[str, Any]rFOptional[_Address]r>r@rG Optional[int]rHOptional[AsyncClientSession]rIrlrJrreturnNone)rr)r>r@r!AsyncCommandCursor[_DocumentType])rrl)rOptional[Mapping[str, Any]])r`r!rrNF) reUnion[_OpReply, _OpMsg]rfrrgzCodecOptions[Mapping[str, Any]]rhrrirlrzSequence[_DocumentOut])rr@)rr)rr)rz$tuple[int, Optional[_CursorAddress]])rrrr)rzAsyncIterator[_DocumentType])rr)rrlrOptional[_DocumentType]rO)rlistrrrrl)rr)rr)rrrrrrrr)rrrzlist[_DocumentType])$__name__ __module__ __qualname____doc__rrrLrRr>rYpropertyr[rbrjrmrfrFrHrsrPryr;r_rrrrrrrrrrrapplyrrrMrKr#r#6sAN+/04!&(2('($ (  ( ) (.((( (T.# ** *&48 % `)`!`7 ` 1 `  ` `;;  " ! *&X D!! ;$ [[rMr#c~eZdZeZ d dfd Z d ddZddZxZS) AsyncRawBatchCommandCursorc X|jdrJt | ||||||||y)a^Create a new cursor / iterator over raw batches of BSON data. Should not be called directly by application developers - see :meth:`~pymongo.asynchronous.collection.AsyncCollection.aggregate_raw_batches` instead. .. seealso:: The MongoDB documentation on `cursors `_. r&N)r-superrL) rCrDrErFr>rGrHrIrJ __class__s rKrLz#AsyncRawBatchCommandCursor.__init__s<&??<000          rMcL|j||}|st|d|S)N)rhr) raw_responser)rCrerfrgrhrirs rKrjz+AsyncRawBatchCommandCursor._unpack_responses0 ,,YK,P  3<? CrMctd)Nz5Cannot call __getitem__ on AsyncRawBatchCommandCursor)r)rCindexs rK __getitem__z&AsyncRawBatchCommandCursor.__getitem__sVWWrMrrr) rerrfrrgrrhrrirlrzlist[Mapping[str, Any]])rr@rr ) rrrrrrLrjr __classcell__)rs@rKrrs%N+/04!& 2 ' $    )  .     H48 % ) ! $  1    ! XrMrN)1r __future__r collectionsrtypingrrrrr r r r r bsonrrpymongorpymongo.asynchronous.cursorrpymongo.cursor_sharedrpymongo.errorsrrrpymongo.messagerrrrrpymongo.responserpymongo.typingsrrr#pymongo.asynchronous.client_sessionrpymongo.asynchronous.collectionr pymongo.asynchronous.poolr!_IS_SYNCr#rrrMrKrs;"   F:7PP,AAF?9 m/m` 2X!3M!B2XrM