dimitrov-adrian / directus-extension-searchsync
Simple Directus 9 extension that syncs content with a remote search engine.
License: MIT License
How can I install this extension in a Directus Docker container?
Is it possible to sync one collection to multiple Meilisearch indexes, with different filters for each index?
Example: sync the collection "products" to the indexes "product-en" and "product-jp".
I have a tag field which is saved as an array field in the collection.
I can't access it from fields: ["tags"].
Hey there,
is it possible to add a date as a float / UNIX timestamp?
I need it like this: https://docs.meilisearch.com/learn/advanced/working_with_dates.html#preparing-your-documents
I tried to transform a Directus timestamp (string) into a UNIX timestamp before adding it to Meilisearch:
transform: (item, { flattenObject, striptags }) => {
  return {
    ...
    start_date: new Date(Date.parse(item.start_date)) / 1000,
    end_date: new Date(Date.parse(item.end_date)) / 1000
  };
},
After reindexing, nothing was there anymore. My goal is to filter by date in Meilisearch.
Thx a lot
Torben
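For reference, a transform along these lines produces plain integer UNIX timestamps (seconds), which is the format the Meilisearch date guide above describes. This is only a sketch: the field names follow the snippet in the question, and toUnixSeconds is an illustrative helper, not part of the extension.

```javascript
// Sketch: convert Directus date strings to integer UNIX timestamps (seconds)
// so Meilisearch can filter/sort on them. Invalid dates become null instead of NaN.
function toUnixSeconds(value) {
  const ms = Date.parse(value);
  return Number.isNaN(ms) ? null : Math.floor(ms / 1000);
}

const transform = (item) => ({
  ...item,
  start_date: toUnixSeconds(item.start_date),
  end_date: toUnixSeconds(item.end_date),
});

console.log(transform({ start_date: "2021-01-01T00:00:00Z", end_date: "2021-01-02T00:00:00Z" }));
// → { start_date: 1609459200, end_date: 1609545600 }
```

Returning null for unparseable dates avoids sending NaN to the index, which some engines silently drop.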
Could you add an installation step to the docs?
Hi there.
Just wondering if there's any chance the transform could optionally be async. I'm trying to get text extraction going, but that relies on waiting for the extraction to finish so that I have the content to send to my index. Would that be possible?
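For illustration, this is roughly what an async transform could look like. It's purely a sketch of the requested behavior, not the extension's current API (which does not await transform results); extractText is a hypothetical stand-in for any asynchronous text-extraction call.

```javascript
// Hypothetical async transform: the extension would need to await its result.
// extractText stands in for an async text-extraction service (assumption).
async function transform(item, { striptags }) {
  const html = await extractText(item);
  return { ...item, body: striptags(html) };
}

// Minimal stand-ins so the sketch runs on its own:
async function extractText(item) {
  return `<p>${item.title}</p>`;
}
const striptags = (html) => html.replace(/<[^>]*>/g, "");

transform({ id: 1, title: "Hello" }, { striptags }).then((doc) => {
  console.log(doc.body); // "Hello"
});
```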
The examples show the image.id format:
"fields": ["image.id", ...]
but this doesn't seem to work.
This does work:
"fields": ["image", ...]
but it only produces the GUID, none of the other properties.
Which means this is impossible:
"fields": ["image.id", "image.width", "image.height"]
Am I missing something?
Not so much an issue as a question / feature request:
If I have an M2O or M2M field and update the values in those collections, the changes are not reflected in the collection I am indexing.
Following your example, a simplified config with "products" as the indexed collection with a M2O field called "category":
"collections": {
"products": {
"fields": ["title", "category.title"]
}
}
If I go to the collection holding the categories and change the "title", those changes do not trigger an update on the indexes that relate to it.
Is it possible to keep them in sync?
BTW, thanks for the extension, great job!
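One conceivable workaround, not something the extension provides, is a small companion hook that listens for updates on the related collection and re-queues the parent items for indexing. The collection names and the reindexParents callback below are purely illustrative, and the `action` registry is a minimal stand-in for Directus 9's hook registration.

```javascript
// Sketch of a hypothetical companion hook: when an item in the related
// collection ("categories") changes, re-queue its parents ("products") for indexing.
function registerHook({ action }, reindexParents) {
  action("categories.items.update", ({ keys }) => reindexParents("products", keys));
}

// Minimal stand-ins showing the wiring, so the sketch is self-contained:
const handlers = {};
const action = (event, fn) => (handlers[event] = fn);
const calls = [];
registerHook({ action }, (collection, keys) => calls.push([collection, keys]));

handlers["categories.items.update"]({ keys: [3] });
console.log(calls); // [["products", [3]]]
```

A real implementation would also have to resolve which parent items actually reference the changed keys before reindexing.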
Hi,
I get the error below; the extension is not working for me.
23:38:55 ✨ Loaded extensions: directus-extension-searchsync, masked-interface
23:38:55 ⚠️ Couldn't register hook "directus-extension-searchsync"
23:38:55 ⚠️ Unexpected token '.'
{
  "name": "directus-playground",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "MIT",
  "dependencies": {
    "directus": "^9.0.0-rc.95",
    "pg": "^8.7.1",
    "directus-extension-searchsync": "dimitrov-adrian/directus-extension-searchsync#v1.0.0-rc.94"
  }
}
Hello, please consider adding support for OpenSearch as a more open-source-friendly alternative to Elasticsearch.
Hi,
I get the error below after updating Directus to 9.1.2.
Hi,
how can I pass search configuration such as searchableAttributes to an index?
https://docs.meilisearch.com/learn/core_concepts/indexes.html#index-settings
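Until the extension supports index settings, one option is to apply them to Meilisearch directly via its index-settings endpoint. The sketch below only builds the request so it can be inspected; you would send it with fetch or axios. The host, index name, and attribute list are illustrative, and note that newer Meilisearch versions use PATCH rather than POST for this endpoint.

```javascript
// Hypothetical helper (not part of this extension): build a request for
// Meilisearch's index-settings endpoint so searchableAttributes can be
// applied separately from indexing.
function buildSettingsRequest(host, index, settings) {
  return {
    method: "POST", // PATCH on newer Meilisearch versions
    url: `${host}/indexes/${index}/settings`,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(settings),
  };
}

const req = buildSettingsRequest("http://localhost:7700", "products", {
  searchableAttributes: ["title", "description"],
});
console.log(req.url); // http://localhost:7700/indexes/products/settings
```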
To allow filtering the Directus query for a collection by nested fields (this also works for count, limit, sort, and similar operations), a new item could be added to the collections field of the configuration to enable such behavior.
I get this error: unknown command 'extension:searchsync'
What am I missing?
Hello again,
On my site, I have reindexOnStart enabled, but I've noticed that the call to readByQuery only returns a maximum of 100 items. If a collection has more items than that, some of them will not be indexed.
The function either needs to remove the limit or work in batches until complete.
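A batched version could look roughly like this. It assumes a service exposing readByQuery({ limit, offset }), similar in shape to Directus' ItemsService; batchSize and pushToIndex are illustrative names, not the extension's actual internals.

```javascript
// Sketch: page through a collection in batches instead of relying on a
// single (100-item-limited) readByQuery call.
async function reindexAll(service, pushToIndex, batchSize = 100) {
  let offset = 0;
  for (;;) {
    const items = await service.readByQuery({ limit: batchSize, offset });
    if (items.length === 0) break;       // nothing left
    await pushToIndex(items);            // send this batch to the search engine
    offset += items.length;
    if (items.length < batchSize) break; // short page = last page
  }
}
```

Batching also keeps memory bounded on large collections, unlike removing the limit outright.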
I'm trying to combine multiple database collections into one Algolia index. I set the "collections..indexName" to the same value, but as each collection runs, it clears the current index. So the first collection listed in my .searchsyncrc.json file imports into the index, but then gets cleared out once the second collection starts being indexed.
Also, semi-related: how can I customize the objectID Algolia uses? Right now it seems to default to the database collection ID, but since all collections go into the same index, items overwrite each other when they share the same objectID. Ideally, mine would be something like "" (product1, product2, product3, etc.). I've tried using "transform" to create my own "objectID" field, but Algolia either overrides it with just the id or ignores it entirely.
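For the composite-ID part, a transform along these lines would generate a collection-scoped objectID, assuming the transform's output reached Algolia unchanged (which, per the report above, may not currently be the case). The "products" prefix is illustrative.

```javascript
// Sketch: derive a collection-scoped objectID so items from different
// collections don't collide in a shared Algolia index.
const makeTransform = (collection) => (item) => ({
  ...item,
  objectID: `${collection}-${item.id}`,
});

const transform = makeTransform("products");
console.log(transform({ id: 42, title: "Widget" }).objectID); // "products-42"
```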
Right now I have to manually run the command npx directus extension:searchsync index
to sync to Meilisearch.
Is there an automatic way to do this, like the Meilisearch plugin for Strapi?
@dimitrov-adrian
Hi, sorry for bothering you again.
I reinstalled the extension following the new readme, but I get strange behavior.
I changed the host to http://localhost:9200/directus,
but only one collection gets saved in the Elasticsearch index.
directus-extension-searchsync Cannot update viaggi/1. Error: Request failed with status code 400
directus-extension-searchsync Cannot update viaggi/2. Error: Request failed with status code 400
directus-extension-searchsync Cannot update viaggi/3. Error: Request failed with status code 400
directus-extension-searchsync Cannot update viaggi/4. Error: Request failed with status code 400
directus-extension-searchsync Cannot update viaggi/5. Error: Request failed with status code 400
directus-extension-searchsync Cannot update viaggi/6. Error: Request failed with status code 400
This is what I get if I query elasticsearch:
GET http://localhost:9200/directus/_search
{
  "took": 37,
  "timed_out": false,
  "_shards": {
    "total": 1,
    "successful": 1,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": {
      "value": 6,
      "relation": "eq"
    },
    "max_score": 1.0,
    "hits": [
      { "_index": "directus", "_type": "tappe", "_id": "1", "_score": 1.0, "_source": { "nome": "Cavalieri", "tipo": "piazza" } },
      { "_index": "directus", "_type": "tappe", "_id": "2", "_score": 1.0, "_source": { "nome": "Massaciuccoli", "tipo": "lago" } },
      { "_index": "directus", "_type": "tappe", "_id": "3", "_score": 1.0, "_source": { "nome": "Villa Flora", "tipo": "piazza" } },
      { "_index": "directus", "_type": "tappe", "_id": "4", "_score": 1.0, "_source": { "nome": "test1", "tipo": "Monumento" } },
      { "_index": "directus", "_type": "tappe", "_id": "5", "_score": 1.0, "_source": { "nome": "fewad", "tipo": "Monumento" } },
      { "_index": "directus", "_type": "tappe", "_id": "6", "_score": 1.0, "_source": { "nome": "tappaprova", "tipo": "Parco" } }
    ]
  }
}
Only the collection 'tappe' gets saved.
The collection 'viaggi' has a one-to-many relationship with 'tappe'.
The opposite has also happened: the 'viaggi' collection got saved, but not 'tappe'.
The configuration file is the same except for the host.
module.exports = {
  server: {
    type: "elasticsearch",
    host: "http://localhost:9200/directus",
  },
  reindexOnStart: true,
  collections: {
    viaggi: {
      filter: {},
      fields: ["nome", "durata_h", "difficolta"],
      transform: formatter,
    },
    tappe: {
      filter: {},
      fields: ["nome", "tipo"],
      transform: formatter,
    },
  },
};

function formatter(value, { flattenObject }) {
  return flattenObject(value);
}
Originally posted by @jfrancarioInera in #7 (comment)
When running the CLI command to reindex, the first record in the Algolia index is updated over and over again for every item in the collection. I think it's because the objectID is set to "undefined", so Algolia finds an existing record with that objectID and updates it, rather than adding a new record.
I was originally configuring searchsync with a JSON file, passing just item fields. In that case, I think Algolia was auto-generating the objectID. Since it was always "undefined", I tried using a JS-based config file so I could pass item.id as the Algolia objectID, but it made no difference.
SearchSync v1.0.2
Directus 9.2.1
This line is wrong: https://github.com/dimitrov-adrian/directus-extension-searchsync/blob/main/lib/indexers/meilisearch.js#L47
Current:
await axios.delete(`${config.host}/indexes/${collection}`, axiosConfig);
It should be:
await axios.delete(`${config.host}/indexes/${collection}/documents`, axiosConfig);
Instead of deleting the index's documents, it deletes the entire index. 😢
Hi!
Trying to add this to my project but getting this output:
yarn add dimitrov-adrian/directus-extension-searchsync
yarn add v1.22.18
[1/4] Resolving packages...
error Couldn't find the binary git
Any thoughts?
Getting this error when trying to post the initial collections to Elasticsearch.
SEARCHSYNC {
action: 'UPDATE',
collection: 'tappe',
id: 5,
error: Error: Request failed with status code 400
at createError (/home/james/directus-test/extensions/hooks/directus-extension-searchsync-main/node_modules/axios/lib/core/createError.js:16:15)
at settle (/home/james/directus-test/extensions/hooks/directus-extension-searchsync-main/node_modules/axios/lib/core/settle.js:17:12)
at IncomingMessage.handleStreamEnd (/home/james/directus-test/extensions/hooks/directus-extension-searchsync-main/node_modules/axios/lib/adapters/http.js:260:11)
at IncomingMessage.emit (events.js:387:35)
at endReadableNT (internal/streams/readable.js:1317:12)
at processTicksAndRejections (internal/process/task_queues.js:82:21) {
config: {
url: 'http://localhost:9200/directus2/_search/tappe/_doc/5',
method: 'post',
data: '{"nome":"fewad","tipo":"Monumento"}',
headers: [Object],
transformRequest: [Array],
transformResponse: [Array],
timeout: 0,
adapter: [Function: httpAdapter],
xsrfCookieName: 'XSRF-TOKEN',
xsrfHeaderName: 'X-XSRF-TOKEN',
maxContentLength: -1,
maxBodyLength: -1,
validateStatus: [Function: validateStatus]
},
request: ClientRequest {
_events: [Object: null prototype],
_eventsCount: 7,
_maxListeners: undefined,
outputData: [],
outputSize: 0,
writable: true,
destroyed: false,
_last: true,
chunkedEncoding: false,
shouldKeepAlive: false,
_defaultKeepAlive: true,
useChunkedEncodingByDefault: true,
sendDate: false,
_removedConnection: false,
_removedContLen: false,
_removedTE: false,
_contentLength: null,
_hasBody: true,
_trailer: '',
finished: true,
_headerSent: true,
socket: [Socket],
_header: 'POST /directus2/_search/tappe/_doc/5 HTTP/1.1\r\n' +
'Accept: application/json, text/plain, */*\r\n' +
'Content-Type: application/json\r\n' +
'User-Agent: axios/0.21.1\r\n' +
'Content-Length: 35\r\n' +
'Host: localhost:9200\r\n' +
'Connection: close\r\n' +
'\r\n',
_keepAliveTimeout: 0,
_onPendingData: [Function: noopPendingOutput],
agent: [Agent],
socketPath: undefined,
method: 'POST',
maxHeaderSize: undefined,
insecureHTTPParser: undefined,
path: '/directus2/_search/tappe/_doc/5',
_ended: true,
res: [IncomingMessage],
aborted: false,
timeoutCb: null,
upgradeOrConnect: false,
parser: null,
maxHeadersCount: null,
reusedSocket: false,
host: 'localhost',
protocol: 'http:',
_redirectable: [Writable],
[Symbol(kCapture)]: false,
[Symbol(kNeedDrain)]: false,
[Symbol(corked)]: 0,
[Symbol(kOutHeaders)]: [Object: null prototype]
},
response: {
status: 400,
statusText: 'Bad Request',
headers: [Object],
config: [Object],
request: [ClientRequest],
data: [Object]
},
isAxiosError: true,
toJSON: [Function: toJSON]
}
}
Config file:
module.exports = {
  server: {
    type: "elasticsearch",
    host: "http://localhost:9200/directus2/_search",
  },
  reindexOnStart: true,
  collections: {
    viaggi: {
      filter: {},
      fields: ["nome", "durata_h", "difficolta"],
      transform: formatter,
    },
    tappe: {
      filter: {},
      fields: ["nome", "tipo"],
      transform: formatter,
    },
  },
};

function formatter(value, { flattenObject }) {
  return flattenObject(value);
}
By the way, is this extension suitable for production use?
When trying to index a collection with a relationship field like thumbnail.filename_disk, it doesn't index that field.
I've only seen the id work: if you add thumbnail as the field, it indexes just the id. But sometimes you need other data; for the author of a post, for example, you might want to index the name, but author.name won't work.
Tested using Directus v9.10.0