When you have full control over the application stack, batching multiple requests into one message indicates that there is no order dependency between the encapsulated messages. Caching database queries in this scenario is a good optimisation, e.g. DataLoader.
I propose adding a `.shared()` mechanism, exposed in the same way as `.stream`.
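For reference, DataLoader does this at the key level: loads requested in the same tick are batched into one query, and repeated loads of the same key are served from its cache. A minimal sketch, assuming a `Chat` model like the one in the example below (the loader name and batch function are illustrative):

```js
const DataLoader = require('dataloader');

// batch function: one DB round trip for all ids requested in the same tick
const chatLoader = new DataLoader(async (ids) => {
  const chats = await Chat.find({ _id: { $in: ids } });
  const byId = new Map(chats.map((c) => [String(c._id), c]));
  return ids.map((id) => byId.get(String(id)) || null);
});

async function example(id) {
  const first = await chatLoader.load(id);
  const second = await chatLoader.load(id); // served from the cache, no second query
  return first === second; // true
}
```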
Example
```js
// create as normal
const ChatSchema = new mongoose.Schema({ text: String });
const Chat = moduleStream('Chat', ChatSchema);

// create a shared ref
const ChatWithCachedQueries = Chat.shared();

// use shared ref
workTodo({ Chat: ChatWithCachedQueries });

async function workTodo(modules) {
  // this find call will be cached, so if the same query is called on this shared module,
  // it will return the same value instead of hitting the database again
  const chats = await modules.Chat.find({});
  // ... do something with chats
}

// reuse shared ref
workSomeOtherWork({ Chat: ChatWithCachedQueries });
```
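To make the idea concrete, here is a minimal sketch of how such a shared ref could cache queries. This is not the library's implementation; the `shared` helper and its internals are illustrative only, and it assumes a mongoose-style model whose `find().exec()` returns a Promise:

```js
// A minimal sketch: repeated identical find() calls on one shared ref
// resolve from an in-memory cache instead of hitting the database again.
function shared(Model) {
  const cache = new Map();
  return {
    find(query = {}) {
      const key = JSON.stringify(query); // cache key: the serialised query
      if (!cache.has(key)) {
        // store the Promise itself so concurrent callers share a single query
        cache.set(key, Model.find(query).exec());
      }
      return cache.get(key);
    },
    // other model methods would be delegated (or cached) in the same way
  };
}
```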
Question: what if the DB object is changed in the app? That change would be reflected in all instances.
This code would allow each function to be given its own unique view of the document, which it could overwrite without affecting any of the other references:
```js
const docDB = { a: 1, aa: 10, b: { c: 5 } };

function getMeARef(docDB) {
  const changes = {}; // per-ref overrides, keyed by property name
  const user = {};
  for (const key in docDB) {
    let val = null;
    if ("object" === typeof docDB[key]) {
      val = getMeARef(docDB[key]); // recurse so nested objects get their own ref
    }
    Object.defineProperty(user, key, { // <- this object is called a "property descriptor"
      enumerable: true,
      configurable: true,
      // Alternatively, use: `get() {}`
      get: function () {
        if (changes.hasOwnProperty(key)) {
          return changes[key]; // read the local override first
        }
        return val || docDB[key]; // otherwise read through to the shared document
      },
      // Alternatively, use: `set(newValue) {}`
      set: function (newValue) {
        if (newValue !== docDB[key]) {
          changes[key] = newValue; // record only values that differ from the document
        } else {
          delete changes[key]; // setting back to the document's value clears the override
        }
      },
    });
  }
  return user;
}

var genRef = getMeARef.bind(null, docDB);
var x = genRef();
var z = genRef();
console.log(JSON.stringify({ x, z, docDB }));
x.a = 8;       // only x sees this override
docDB.aa = 20; // unchanged keys read through, so both refs see this
console.log(JSON.stringify({ x, z, docDB }));
```
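Running this, the first log shows `x`, `z` and `docDB` all identical; after the two assignments, `x` reads `a: 8` while `z` still reads `a: 1`, and both refs pick up `aa: 20` because unchanged keys read through to `docDB`.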
Still needs a lot of work, e.g.:
- arrays & complex data types like Date
- keys being added to or removed from the document
- handling functions and population
-> However, this approach would be a gateway to sending only the subset of changes as an update <-
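As an illustration of that last point, a hypothetical helper (not part of the code above) could derive the changed subset by comparing a ref against the original document, and the result could then be sent as a partial update:

```js
// Hypothetical sketch: collect only the keys whose value on the ref differs
// from the original document, recursing into nested objects.
function diffRef(ref, original) {
  const changed = {};
  for (const key of Object.keys(original)) {
    if (typeof original[key] === 'object' && original[key] !== null) {
      const nested = diffRef(ref[key], original[key]);
      if (Object.keys(nested).length) changed[key] = nested;
    } else if (ref[key] !== original[key]) {
      changed[key] = ref[key];
    }
  }
  return changed;
}

const ref = genRef();
ref.a = 8;
console.log(diffRef(ref, docDB)); // { a: 8 } -> could become { $set: { a: 8 } }
```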