Currently I have code that takes any update on "Source DB" and copies it to "Destination DB". I think I need to change it so that a single read call from the "Source DB" copies the change in parallel to several "Destination DB"s.

```python
stream = None
while True:
    try:
        collection_in = source_db.get_collection(collection_name)
        collection_out = destination_db.get_collection(collection_name)
        with collection_in.watch(full_document='updateLookup') as stream:
            for change in stream:
                oper_type = change['operationType']
                logger.debug(f"{oper_type} received: {collection_name}")
                if oper_type in ignored_ops:
                    logger.debug(f"{oper_type} operation ignored")
                    continue
                if oper_type == "insert":
                    callback = db_insert_callback
                elif oper_type == "replace":
                    callback = db_update_callback
                elif oper_type == "delete":
                    callback = db_remove_callback
                else:
                    # unknown operation type: skip instead of reusing a
                    # callback left over from a previous iteration
                    continue
                callback(change, collection_out=collection_out)
    except Exception as ex:
        logger.exception(ex)
    finally:
        if stream:
            stream.close()
```

Example of the insert callback:

```python
def db_insert_callback(insert_change, collection_out):
    doc = insert_change['fullDocument']
    try:
        # Collection.insert() is deprecated/removed in recent pymongo;
        # use insert_one() instead.
        collection_out.insert_one(doc)
    except Exception as ex:
        logger.exception(ex)
```
I found a solution for this; maybe it can help anyone facing the same problem I did. Here source.txt contains some data and dest.txt is the file the data is written to. For text files it is done this way:

```python
f = open('source.txt', 'r')
g = open('dest.txt', 'w')
r = f.read()
for x in r.split():
    g.write(x + ' ')
f.close()
g.close()
```

So, for databases, you could probably use threading: connect to each destination database in its own thread and do the transfers in those threads.
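The threading idea above can be sketched as a fan-out: one reader consumes each change exactly once and broadcasts it to a per-destination queue, with a worker thread draining each queue. This is a minimal stdlib-only sketch; the change stream is simulated with a plain list of dicts, and the real code would iterate `collection_in.watch(...)` in the reader loop and call `insert_one`/`replace_one`/`delete_one` in the workers (`fan_out` and its arguments are hypothetical names, not from the original code).

```python
# Sketch: fan out a single change-stream read to several destinations.
# One reader loop, one queue + worker thread per destination, so a slow
# destination does not stall the single read from the source.
import queue
import threading

def fan_out(changes, num_destinations):
    """Read each change once and hand it to every destination worker."""
    queues = [queue.Queue() for _ in range(num_destinations)]
    results = [[] for _ in range(num_destinations)]  # stand-in for real writes

    def worker(q, out):
        while True:
            change = q.get()
            if change is None:      # sentinel: stream is finished
                break
            out.append(change)      # real code: apply the change to one DB

    threads = [
        threading.Thread(target=worker, args=(q, out))
        for q, out in zip(queues, results)
    ]
    for t in threads:
        t.start()

    # The single "read call": each change is consumed once, then broadcast.
    for change in changes:
        for q in queues:
            q.put(change)
    for q in queues:
        q.put(None)
    for t in threads:
        t.join()
    return results
```

With a real change stream you would replace the `for change in changes` loop with the `collection_in.watch(...)` loop from the question, and each worker would hold its own `MongoClient` connection to one destination.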