robconery / moebius
A functional query tool for Elixir
License: MIT License
So there isn't a matching clause for filter when the :where clause isn't empty. Currently Moebius assumes that all filters that have criteria are set at the same time.
# This works
db(:users)
|> filter(first_name: "Ron", last_name: "Swanson")
|> Users.Database.run
# And this works
db(:users)
|> filter(first_name: "Ron")
|> filter("last_name = $2", ["Swanson"])
|> Users.Database.run
# This does not work
db(:users)
|> filter(first_name: "Ron")
|> filter(last_name: "Swanson")
|> Users.Database.run
The last scenario produces the following error:
** (FunctionClauseError) no function clause matching in Moebius.QueryFilter.filter/2
(moebius) lib/moebius/query_filter.ex:88: Moebius.QueryFilter.filter(%Moebius.QueryCommand{columns: nil, conn: nil, error: nil, group_by: nil, join: [""], limit: "", offset: "", order: "", params: ["Ron"], pid: nil, sql: nil, table_name: "users", type: :select, vals: nil, where: " where first_name = $1", where_columns: [:first_name]}, [lastname: "Swanson"])
This is an issue for composing queries: when filters are added conditionally based on which values are present, the error will be raised. This should be an easy fix and I'll take a look at it later.
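Until that's fixed, one way to compose conditionally is to build the criteria list first and call filter/2 only once. A minimal sketch (FilterHelper is a hypothetical helper, not part of Moebius):

```elixir
# Hypothetical helper: keep only the criteria that are actually
# present, so filter/2 is called exactly once with a keyword list
# (the form that works today).
defmodule FilterHelper do
  def present(criteria) do
    Enum.reject(criteria, fn {_key, value} -> is_nil(value) end)
  end
end

# Usage sketch against the Moebius pipeline from the report:
#   db(:users)
#   |> filter(FilterHelper.present(first_name: "Ron", last_name: nil))
#   |> Users.Database.run
```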
In order to work around this I'm piping SQL directly to the database like this:
def update_movie(release_date, movie) do
  "UPDATE movies SET release_date = '#{format_date(release_date)}' WHERE original_title = '#{movie.original_title}';"
  |> LetterboxdCal.Db.run
end
It was then pointed out to me that this is susceptible to SQL injection as the data is user supplied.
Does Moebius do anything to sanitise user data, or is it up to me to handle that? I looked through the codebase and I couldn't see anything that looked like it handled that.
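For what it's worth, the query-builder forms (filter/2, update/2) send values as Postgrex parameters ($1, $2, ...), so they are not spliced into the SQL string; raw strings piped to Db.run are interpolated verbatim. A runnable illustration of the risk (table and column names are just examples):

```elixir
# Why raw interpolation is unsafe: a crafted value closes the string
# literal and splices arbitrary SQL into the statement.
title = "x'; DROP TABLE movies; --"
sql = "UPDATE movies SET release_date = NULL WHERE original_title = '#{title}';"
IO.puts(sql)
# The statement now ends with: ...original_title = 'x'; DROP TABLE movies; --';
```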
Thanks.
When trying to update an array column:
Moebius.Query.db(:users)
|> Moebius.Query.filter(email: "[email protected]")
|> Moebius.Query.update(roles: ["admin"])
|> App.Db.query
got exception:
** (ArgumentError) Postgrex expected a list that can be encoded/cast to type "_varchar", got "admin". Please make sure the value you are passing matches the definition in your table or in your query or convert the value accordingly.
(postgrex) lib/postgrex/extensions/array.ex:11: Postgrex.Extensions.Array.encode/4
(postgrex) lib/postgrex/query.ex:135: DBConnection.Query.Postgrex.Query.do_encode/4
(postgrex) lib/postgrex/query.ex:72: DBConnection.Query.Postgrex.Query.encode/3
(db_connection) lib/db_connection.ex:859: DBConnection.describe_execute/4
(db_connection) lib/db_connection.ex:987: DBConnection.run_begin/3
(db_connection) lib/db_connection.ex:921: DBConnection.run_meter/3
(db_connection) lib/db_connection.ex:463: DBConnection.prepare_execute/4
(postgrex) lib/postgrex.ex:125: Postgrex.query/4
I'd like to build a simple up/down migrations tool for Moebius. I have a prototype working. I know @robconery does not like migrations; well, I do, and probably many others too. I find it useful mostly when working with a team of devs. So I intend it to be a separate library.
I've figured out so far that I can put my migrations into scripts/migrations/some_migration_timestamp_up.sql etc. This is working and I am using
Moebius.Query.sql_file(:"migrations/#{timestamp}_up") |> Moebius.Db.run
to run the migration. Then I update a database table with the list of applied migrations if the above succeeds. When I roll back a migration I do the opposite: remove the applied migration from the list.
The problem is: I am limited to one SQL command per file. If I try to add multiple ones, I get:
{:error, "cannot insert multiple commands into a prepared statement"}
I don't even need it to be a prepared statement.
Is there a way to dump raw SQL to Moebius.Db somehow?
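Since Moebius refuses multiple commands in one prepared statement, one rough workaround is to split the file yourself and run each statement separately. MigrationRunner is a hypothetical helper; the naive split breaks on semicolons inside string literals or plpgsql function bodies, so it only suits simple DDL files:

```elixir
# Hypothetical helper: split a migration file into individual
# statements on semicolons, dropping empty fragments.
defmodule MigrationRunner do
  def statements(sql) do
    sql
    |> String.split(";")
    |> Enum.map(&String.trim/1)
    |> Enum.reject(&(&1 == ""))
  end
end

# Each statement could then be piped to Moebius.Db.run/1 in turn:
#   "scripts/migrations/#{timestamp}_up.sql"
#   |> File.read!()
#   |> MigrationRunner.statements()
#   |> Enum.each(&Moebius.Db.run/1)
```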
Right now it's an empty array, which is silly.
I saw you were shelling out to the psql command since, per a comment, you couldn't handle more than one connection with postgrex.
You might be able to take advantage of Poolboy to set up a connection pool instead of relying on an external system command.
I'm trying to use the transaction API:
transaction fn(tx) ->
  "insert into x(text) values('foo');" |> Moebius.Db.run(tx)
  "select * from x limit 1;" |> Moebius.Db.run(tx)
end
But this is causing the underlying gen_server to crash:
** (exit) exited in: :gen_server.call(My.Db, {:checkout, #Reference<0.0.2.364>, true, 5000}, 5000)
For note, if I run the same code without the transaction fun/1 wrapper, it seems to execute just fine:
"insert into x(text) values('foo');" |> Moebius.Db.run
"select * from x limit 1;" |> Moebius.Db.run
Is there something obvious that I'm doing wrong?
This looks like a very useful library. Is there any interest in bringing it up to date?
I'm glad to help but not quite sure where to start. I get this after updating libs and running tests:
Generated moebius app
** (EXIT from #PID<0.93.0>) shutdown: failed to start child: TestDb
** (EXIT) an exception was raised:
** (UndefinedFunctionError) function DBConnection.Poolboy.child_spec/1 is undefined (module DBConnection.Poolboy is not available)
DBConnection.Poolboy.child_spec({Postgrex.Protocol, [username: "chase", types: Moebius.PostgrexTypes, name: TestDb, url: "postgres://localhost/meebuss", database: "meebuss", hostname: "localhost", pool: DBConnection.Poolboy]})
(db_connection 2.4.1) lib/db_connection.ex:444: DBConnection.start_link/2
(stdlib 3.16.1) supervisor.erl:414: :supervisor.do_start_child_i/3
(stdlib 3.16.1) supervisor.erl:400: :supervisor.do_start_child/2
(stdlib 3.16.1) supervisor.erl:384: anonymous fn/3 in :supervisor.start_children/2
(stdlib 3.16.1) supervisor.erl:1242: :supervisor.children_map/4
(stdlib 3.16.1) supervisor.erl:350: :supervisor.init_children/2
(stdlib 3.16.1) gen_server.erl:423: :gen_server.init_it/2
Hey, I'm trying to execute a simple query inside iex, but it hangs.
This is my config:
config :moebius,
  connection: [
    hostname: "localhost",
    database: "database_dev",
    pool_mod: DBConnection.Poolboy
  ],
  scripts: "priv/queries"
Supervision:
def start(_type, _args) do
  import Supervisor.Spec, warn: false

  # Define workers and child supervisors to be supervised
  children = [
    supervisor(Repo, []),
    worker(Moebius.Db, [Moebius.get_connection])
  ] ++ cowboy_worker

  opts = [strategy: :one_for_one, name: Taped.Supervisor]
  Supervisor.start_link(children, opts)
end
and query:
$ iex -S mix
iex(1)> import Moebius.DocumentQuery
iex(2)> result = db(:videos) |> Moebius.Db.first
# and iex hangs
I'm trying to use Moebius with a pool, but it won't work.
First of all, pool_mod does nothing; as I read in the postgrex docs, it should be a pool key. When I change the config to use the pool key, I get a weird exception:
** (exit) an exception was raised:
** (UndefinedFunctionError) function :invalid_message.exception/1 is undefined (module :invalid_message is not available)
:invalid_message.exception([])
(db_connection) lib/db_connection.ex:925: DBConnection.checkout/2
(db_connection) lib/db_connection.ex:741: DBConnection.run/3
(db_connection) lib/db_connection.ex:1132: DBConnection.run_meter/3
(db_connection) lib/db_connection.ex:584: DBConnection.prepare_execute/4
(postgrex) lib/postgrex.ex:125: Postgrex.query/4
(moebius) lib/moebius/database.ex:236: Moebius.Database.execute/1
I noticed that this exception also appears with postgrex directly:
iex(5)> {:ok, pid} = Postgrex.start_link(hostname: "localhost", database: "taped_dev", pool: DBConnection.Poolboy)
{:ok, #PID<0.350.0>}
iex(6)> Postgrex.query!(pid, "SELECT * FROM users", [])
** (UndefinedFunctionError) function :invalid_message.exception/1 is undefined (module :invalid_message is not available)
:invalid_message.exception([])
(db_connection) lib/db_connection.ex:925: DBConnection.checkout/2
(db_connection) lib/db_connection.ex:741: DBConnection.run/3
(db_connection) lib/db_connection.ex:1132: DBConnection.run_meter/3
(db_connection) lib/db_connection.ex:584: DBConnection.prepare_execute/4
(db_connection) lib/db_connection.ex:600: DBConnection.prepare_execute!/4
(postgrex) lib/postgrex.ex:146: Postgrex.query!/4
And the solution is simple: we need to execute the query with the pool: option:
Postgrex.query!(pid, "SELECT * FROM users", [], pool: DBConnection.Poolboy)
These patches work for me: fazibear@07c94cb fazibear@c494bec
but is this a proper solution?
Thanks
I have Postgres set up in Docker, via TCP, but when I call run_with_psql it attempts to connect to a socket. Looking at the command, it seems that it never uses the host/port settings I configured. I wonder if this is intentional?
Aug 3 04:34:55 PM error: module Inflex is not loaded and could not be found
Aug 3 04:34:55 PM lib/moebius/query.ex:3: Moebius.Query (module)
Aug 3 04:34:55 PM
Aug 3 04:34:55 PM == Compilation error in file lib/moebius/query.ex ==
Aug 3 04:34:55 PM ** (CompileError) lib/moebius/query.ex: cannot compile module Moebius.Query (errors have been logged)
It's not clear why, but the Inflex library is not loading when pointing at hex.
Opened an issue at the Inflex lib: nurugger07/inflex#95
Using Moebius in a project will result in the following error during compilation with Elixir 1.2
cannot import Moebius.Query.with/1 because it conflicts with Elixir special forms
(elixir) src/elixir_import.erl:92: :elixir_import.calculate/6
(elixir) src/elixir_import.erl:22: :elixir_import.import/4
Elixir 1.2 added the with special form to match on multiple expressions:
with {:ok, contents} <- File.read("my_file.ex"),
     {res, binding} <- Code.eval_string(contents),
     do: {:ok, res}
Is renaming Moebius.Query.with/1 the best option (like they did in Pavlov; see sproutapp/pavlov#49)?
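The :except option of import/2 is another way out (assuming the project currently imports Moebius.Query wholesale); with Moebius it would read import Moebius.Query, except: [with: 1]. The mechanism, demonstrated here with a stdlib module so the snippet runs as-is:

```elixir
# `except:` drops a named function from the import set; the fully
# qualified call form keeps working. With Moebius this would be:
#   import Moebius.Query, except: [with: 1]
import String, except: [reverse: 1]

"cba" = String.reverse("abc")  # still callable via the module name
```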
Hi,
I am trying to learn how to use Moebius. It turns out that to do this I have to learn a good bit more about SQL and how to do migrations etc.
I am curious as to why the SQL in test_helper.exs is duplicated in test_schema.sql. I cannot find any piece of code that uses test_schema.sql. Is it used anywhere?
Normally when I try to create the documentation from source code in Elixir projects (e.g. Poison, Phoenix, Ecto, etc.) I do the following:
$ MIX_ENV=docs mix docs
But in this case I got the following error:
** (FunctionClauseError) no function clause matching in Moebius.Mixfile.deps/1
mix.exs:25: Moebius.Mixfile.deps(:docs)
mix.exs:12: Moebius.Mixfile.project/0
(mix) lib/mix/project.ex:62: Mix.Project.push/3
(stdlib) lists.erl:1262: :lists.foldl/3
I'd love to follow what MassiveJS does for documents: jsonb contains queries. This will be interesting.
How's that for a title? Currently the result map from the database comes back with string keys, and I wrote a hackey thing, coerce_atoms, to enable dot notation. This only does a comprehension on the root keys; it won't go any deeper, which, in the case of a regular query, is fine. When we move to doing more document stuff (jsonb), however, we'll want this.
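A deep variant of that coercion could look like the following (DeepAtoms is a hypothetical helper, not part of Moebius; note that String.to_atom/1 on untrusted keys leaks atoms, so String.to_existing_atom/1 is safer when the keys are known in advance):

```elixir
# Hypothetical recursive key coercion for nested maps and lists.
# Caveat: matches any plain map; structs would also match %{} here.
defmodule DeepAtoms do
  def coerce(%{} = map),
    do: Map.new(map, fn {k, v} -> {atomize(k), coerce(v)} end)

  def coerce(list) when is_list(list), do: Enum.map(list, &coerce/1)
  def coerce(other), do: other

  defp atomize(key) when is_binary(key), do: String.to_atom(key)
  defp atomize(key), do: key
end
```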
Unless I've completely missed something, working with dates and Postgrex is... rather frustrating. The only construct we can use that won't give Postgrex fits is %Postgrex.Timestamp.
I'd like to make this less painful and I'm not completely sure how. We could:
Ugh. Dates.
Thinking a bit out loud here:
Instead of having to keep track of parameter positions in a query it would be useful to use named parameters, i.e.
SELECT * FROM users WHERE lname = :lname AND fname = :fname
Then you'd have a hash of params, in any order and may have unused entries, that would be combined for the result.
%{id: 1234, fname: "Charlie", lname: "Brown", dog: "snoopy", adversary: "Lucy"}
Under the covers this would get converted to a tuple that can then run as a normal pg query:
{"SELECT * FROM users WHERE lname = $1 AND fname = $2",["Brown", "Charlie"]}
Here's a rather terse way of doing it that I banged out this afternoon.
def convert_named_params(query, opts) do
  named_param_regex = ~r/:(\w+)/i

  # run the regex and convert the results to a list of {atom, string} pairs
  m =
    Regex.scan(named_param_regex, query)
    |> Enum.map(fn [v, a] -> {String.to_atom(a), v} end)

  # replace the named params in the query with $1...$n
  {q, _} =
    Enum.reduce(m, {query, 1}, fn {_, v}, {q, i} ->
      {String.replace(q, v, "$#{i}"), i + 1}
    end)

  # create a list of options to be used as params
  o = Enum.map(m, fn {k, _} -> opts[k] end)
  {q, o}
end
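Wrapping the sketch above in a module shows what it produces. One caveat: String.replace/3 replaces every occurrence, so parameter names where one is a prefix of another (:name and :name2) would need a stricter pattern.

```elixir
# The same sketch as above, wrapped in a module so it can be run
# standalone (NamedParams is just an illustrative module name).
defmodule NamedParams do
  def convert_named_params(query, opts) do
    named_param_regex = ~r/:(\w+)/i

    m =
      Regex.scan(named_param_regex, query)
      |> Enum.map(fn [v, a] -> {String.to_atom(a), v} end)

    {q, _} =
      Enum.reduce(m, {query, 1}, fn {_, v}, {q, i} ->
        {String.replace(q, v, "$#{i}"), i + 1}
      end)

    o = Enum.map(m, fn {k, _} -> opts[k] end)
    {q, o}
  end
end

NamedParams.convert_named_params(
  "SELECT * FROM users WHERE lname = :lname AND fname = :fname",
  %{id: 1234, fname: "Charlie", lname: "Brown", dog: "snoopy"}
)
# => {"SELECT * FROM users WHERE lname = $1 AND fname = $2", ["Brown", "Charlie"]}
```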
I need to pass in some custom types. I've done this in my application itself by overriding Postgrex.Types. That works locally but fails when doing a build.
For example, I need something like this:
types = [Geo.PostGIS.Extension, Postgrex.Extensions.JSON, PostgrexTypes.UUID]
Postgrex.Types.define(Moebius.PostgrexTypes, types, json: Jason)
How do you feel about the approach allowing a list of type modules being loaded via Moebius config? Any suggestions on best practices for approaching this? Glad to work on this and submit a PR myself.
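One possible configuration shape for this proposal (the :type_extensions and :json_library keys are hypothetical; nothing reads them in Moebius today):

```elixir
# Hypothetical keys: Moebius would collect these at startup and pass
# the modules to Postgrex.Types.define/3, as in the snippet above.
config :moebius,
  type_extensions: [Geo.PostGIS.Extension, PostgrexTypes.UUID],
  json_library: Jason
```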
This can happen now with CTEs and SQL Files, but it would be nice to have a mechanism in code.
The Postgrex driver works OK for the most part, but the date/time limitations are frustrating (you can only use its struct, not a string) and the type limitations are extremely frustrating: throwing when encountering tsvector, for instance, and not understanding that a string should be an atom.
I've thought about forking it, but I don't want to maintain a driver myself; and epgsql has been around forever, so... let's give it a whirl and see if it will work for us. I'd love to be able to let people work with dates in a better way; that's the prime motivator.
Does that mean we should step back to timex 1.1.0 with the last version of moebius?
Failed to use "timex" because
moebius (version 2.0.0) requires ~> 1.0.0
Locked to 2.1.4 in your mix.lock
Ecto allows you to connect to a db using the url like so:
config :drip_emails, DripEmails.Repo,
  adapter: Ecto.Adapters.Postgres,
  url: System.get_env("DATABASE_URL"),
  size: 20 # The amount of database connections in the pool
It would be great to connect to PG this way instead of having to specify the options individually.
Here's where this happens in Ecto:
def parse_url(""), do: []

def parse_url({:system, env}) when is_binary(env) do
  parse_url(System.get_env(env) || "")
end

def parse_url(url) when is_binary(url) do
  info = url |> URI.decode() |> URI.parse()

  if is_nil(info.host) do
    raise Ecto.InvalidURLError, url: url, message: "host is not present"
  end

  if is_nil(info.path) or not (info.path =~ ~r"^/([^/])+$") do
    raise Ecto.InvalidURLError, url: url, message: "path should be a database name"
  end

  destructure [username, password], info.userinfo && String.split(info.userinfo, ":")
  "/" <> database = info.path

  opts = [username: username,
          password: password,
          database: database,
          hostname: info.host,
          port: info.port]

  Enum.reject(opts, fn {_k, v} -> is_nil(v) end)
end
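For anyone wanting to try this without pulling in Ecto, here is a simplified, runnable sketch of the same parsing (no error handling; UrlOpts is just an illustrative module name):

```elixir
# Minimal version of Ecto's URL parsing: pull credentials, host,
# port, and database name out of a postgres:// URL.
defmodule UrlOpts do
  def parse(url) do
    info = URI.parse(url)

    # destructure/2 tolerates a nil right-hand side (no userinfo)
    destructure [username, password],
                info.userinfo && String.split(info.userinfo, ":")

    "/" <> database = info.path

    [username: username,
     password: password,
     database: database,
     hostname: info.host,
     port: info.port]
    |> Enum.reject(fn {_k, v} -> is_nil(v) end)
  end
end

UrlOpts.parse("postgres://user:pass@localhost:5432/mydb")
# => [username: "user", password: "pass", database: "mydb",
#     hostname: "localhost", port: 5432]
```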
I'd like to have Moebius create a document table on save if one doesn't exist. The SQL would be something like...
create table [table_name](
  id serial primary key not null,
  body jsonb not null,
  created_at timestamptz not null default now(),
  updated_at timestamptz
)
We did this with Massive by simply catching an insert error. It might be a bit difficult the way we're doing it now since we're handing things to execute, which... I'm still unsure about.
The following test fails:
test "documents save within a transaction" do
  res =
    transaction fn(tx) ->
      Moebius.DocumentQuery.db(:monkies)
      |> Moebius.DocumentQuery.searchable([:name])
      |> TestDb.save(%{name: "Mike"}, tx)
    end

  case res do
    {:error, _err} -> flunk "No errors here!"
    res -> assert res
  end
end
Hi, right now I'm not using async tests, because I clear the database before each test.
def clear_database do
  @tables
  |> Enum.each(fn(table) ->
    table |> db |> delete |> run
  end)
end
Is it possible to get a new connection for every test, like Ecto, and enable async tests?
Are there any plans to open Moebius up to other DB engines / drivers?
I've just started a project (heavily influenced by what you're doing here) to wrap the TDS driver. Would my time be better spent looking into creating an adapter for Moebius instead?
Would be great to hear your thoughts!
I'm trying to do a query that would result in SQL like this:
select * from emails where deleted_at IS NULL;
# or
select * from emails where deleted_at IS NOT NULL;
db(:emails)
|> filter(:deleted_at, "IS NULL")
|> all
Results in:
** (FunctionClauseError) no function clause matching in Postgrex.Extensions.Binary.encode_timestamp/1
(postgrex) lib/postgrex/extensions/binary.ex:241: Postgrex.Extensions.Binary.encode_timestamp("IS NULL")
(postgrex) lib/postgrex/protocol.ex:299: anonymous fn/2 in Postgrex.Protocol.encode_params/1
(elixir) lib/enum.ex:1043: anonymous fn/3 in Enum.map/2
(elixir) lib/enum.ex:1387: Enum."-reduce/3-lists^foldl/2-0-"/3
(elixir) lib/enum.ex:1043: Enum.map/2
(postgrex) lib/postgrex/protocol.ex:294: Postgrex.Protocol.encode_params/1
(postgrex) lib/postgrex/protocol.ex:264: Postgrex.Protocol.send_params/2
(postgrex) lib/postgrex/protocol.ex:136: Postgrex.Protocol.message/3
Is there a supported way to express IS NULL / IS NOT NULL filters?
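A possible workaround, assuming the string-predicate form of filter/2 seen in other reports here (filter("last_name = $2", [...])) also accepts a predicate with an empty parameter list; untested sketch:

```elixir
# Keep IS NULL out of the parameter list entirely so nothing gets
# encoded as a timestamp value.
db(:emails)
|> filter("deleted_at IS NULL", [])
|> all
```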
Today, Moebius starts a connection directly on the caller process:
Lines 9 to 30 in ba2bc6a
This must be avoided in Elixir. You want new processes to always be part of a supervision tree; this way you can start :observer and access all processes holding database connections. Not only that, you can do a simple Supervisor.count_children to know how many connections you have open at a given moment. In other words, supervision trees are extremely useful even if you don't have a restart strategy (i.e. all workers are temporary).
It is extremely important that we follow OTP conventions; otherwise many of the benefits we have been talking about over and over regarding the VM and runtime will be lost (this was the disclaimer I was talking about on Twitter).
Keep in mind this is not about pooling connections, handling transactions, or any of that. It is simply about following OTP principles. I thought the solution would be tiny but, since we want the connection to crash if the caller exits, we would need to either add this feature to postgrex or support some sort of ownership model. Because I think this could be generally useful, I will talk to @fishcakez about making this part of db_connection so postgrex gets it for free soonish.
It's possible to prepare a query once and execute it on any process in the pool. This can save a round trip to the database. Please see https://github.com/fishcakez/postgrex_cache for an example; there should be a better API to handle this in the next release of postgrex. This will still work with pgbouncer etc. if prepare: :unnamed is passed to Postgrex.start_link/child_spec, but has limited benefit (it encodes params before checking out a connection) because the query will always be prepared and executed.
Hello,
I have an umbrella with Phoenix 1.2 and just added moebius; however, I get an issue because moebius wants poison 3.0 and phoenix wants poison 2.0.
Do you know how I could deal with this?
Running dependency resolution...
Failed to use "poison" (version 3.0.0) because
  moebius (version 3.0.1) requires ~> 3.0.0
  phoenix (version 1.2.1) requires ~> 1.5 or ~> 2.0
  mix.lock specifies 3.0.0
Cheers,
Kevin
Hi. I use the following code to add an on conflict (col) do nothing clause to the SQL statement Moebius generates, and wonder (a) whether this would be a generally useful addition to the library and (b) how detailed an implementation would be required to be general enough. (on conflict can be arbitrarily complicated; personally I'd opt for a function for anything more involved than this simple statement, but I know that might be an extreme position.)
@spec ignore_conflict(%Moebius.QueryCommand{}, String.t) :: %Moebius.QueryCommand{}
@doc "Do nothing when an insert conflict is detected on COL."
def ignore_conflict(%Moebius.QueryCommand{} = cmd, col) do
  conflict = "on conflict (#{col}) do nothing"
  new = Regex.replace(~r/returning.+$/, cmd.sql, "#{conflict} \\0")
  %{cmd | sql: new}
end
If there is interest, I'd be happy to supply a PR (with any guidance incorporated). Thanks!
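For reference, the regex rewrite at the heart of ignore_conflict/2 can be exercised without any Moebius structs (the SQL below is just an example statement):

```elixir
# Insert the conflict clause immediately before the `returning`
# clause, keeping the original match via the \0 backreference.
sql = "insert into users (email) values ($1) returning *"
conflict = "on conflict (email) do nothing"
new = Regex.replace(~r/returning.+$/, sql, "#{conflict} \\0")
# new == "insert into users (email) values ($1) on conflict (email) do nothing returning *"
```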
Not a big deal this one but I found some mentions to a possible need to support a v3.
As far as I'm aware, the convention for functions that can fail is to return {:ok, value} or {:error, reason}. At the moment Moebius.Db.run(query) (and a few others) doesn't put successful return values in an :ok tuple.
Would this be an acceptable change for a major update?
As I have the convention of putting successful paths first, I keep having to pattern match on some contents of the result.
case MyDb.run(action) do
  record = %{id: ^id} ->
    {:ok, unpack(record)}
  {:error, reason} ->
    {:error, reason}
end
Certainly not the end of the world, for sure.
Hello!
I'm working on adding search to my database, but for some reason, no matter what I do, the limit isn't honored when using Moebius.Query.search/2. I think this might be related to issue #98.
Here's my code:
db(:my_table)
|> search(for: "term", in: [:field, :other_field])
|> limit(10)
|> MyApp.DB.run
For one particular query, I have a max of 33 results and it always returns all of them instead of limiting it to 10 results.
I'm having trouble updating a date with Moebius. I'm using Elixir 1.2.5 and Moebius 2.0.1.
This is my code.
def update_movie(release_date, movie) do
  db(:movies)
  |> filter(original_title: movie.original_title)
  |> update(release_date: format_date(release_date))
  |> LetterboxdCal.Db.run
end

def format_date(date_string) do
  Timex.parse(date_string, "%Y-%m-%dT00:00:00.000Z", :strftime)
  |> elem(1)
  |> Timex.format("%Y-%m-%d", :strftime)
  |> elem(1)
end
This is the error.
** (FunctionClauseError) no function clause matching in Moebius.Extensions.DateExtension.encode/4
(moebius) lib/moebius/extensions/date_extension.ex:27: Moebius.Extensions.DateExtension.encode(%Postgrex.TypeInfo{array_elem: 0, base_type: 0, comp_elems: [], input: "date_in", oid: 1082, output: "date_out", receive: "date_recv", send: "date_send", type: "date"}, "2016-07-21", 135216, nil)
(postgrex) lib/postgrex/query.ex:100: DBConnection.Query.Postgrex.Query.do_encode/4
(postgrex) lib/postgrex/query.ex:61: DBConnection.Query.Postgrex.Query.encode/3
(db_connection) lib/db_connection.ex:885: DBConnection.describe_execute/5
(db_connection) lib/db_connection.ex:1009: DBConnection.run_begin/3
(db_connection) lib/db_connection.ex:957: DBConnection.run_meter/3
(db_connection) lib/db_connection.ex:421: DBConnection.query/4
(postgrex) lib/postgrex.ex:111: Postgrex.query/4
This is the Postgres table definition.
Column | Type | Modifiers
----------------+--------------------------+-----------------------------------------------------
id | integer | not null default nextval('movies_id_seq'::regclass)
original_title | text | not null
release_date | date |
created_at | timestamp with time zone | not null default now()
updated_at | timestamp with time zone |
I've worked around it for now by using SQL directly, which is working.
def update_movie(release_date, movie) do
  "UPDATE movies SET release_date = '#{format_date(release_date)}' WHERE original_title = '#{movie.original_title}';"
  |> LetterboxdCal.Db.run
end
Any ideas what I'm doing wrong?
exited in: :gen_server.call(Moebius.Db, {:checkout, #Reference<0.0.2.13839>, true, 15000}, 5000) ** (EXIT) no process
What does this mean? Thanks.
If parameters don't match the query an ArgumentError will be raised in future versions of postgrex.
I want to be able to ...
db(:users) |> find(1)
Fascinating discussion at hamiltop/rethinkdb-elixir#89 in which Rob finds out that atoms are global and can cause runtime errors if not previously compiled.
For instance, if you create this test and run only this test:
defmodule AtomTest do
  use ExUnit.Case
  import Moebius.DocumentQuery

  test "Pulling atoms might fail" do
    db(:monkies) |> first |> IO.inspect
  end
end
You will get this error (because "sku" is a key in the data which is not represented as a key anywhere in the JIT-compiled test - it's only present in other tests which are not compiled):
1) test Pulling atoms might fail (AtomTest)
test/atom_test.exs:8
** (ArgumentError) argument error
stacktrace:
:erlang.binary_to_existing_atom("sku", :utf8)
lib/poison/parser.ex:97: Poison.Parser.object_name/2
lib/poison/parser.ex:82: Poison.Parser.object_pairs/3
lib/poison/parser.ex:36: Poison.Parser.parse/2
lib/poison/parser.ex:50: Poison.Parser.parse!/2
Because Poison is trying to reuse an existing atom (it has to). We've never encountered this problem because our atoms are always present in the data we're writing.
Now, a logical assumption would be that the atom keys will always be present in an app if we're writing them in the first place, but that isn't the case. It would be a trivial matter to pass the params from a POST directly into Postgres (which would be string-keyed) and then, when you pull them out later, you hit the runtime problem.
This is the code example for the existence operator:
buddies =
  db(:friends)
  |> exists(:tags, "best")
  |> Moebius.Db.run
Where should the ? go?
Not sure if this is a bug or a feature request, but I'm wondering how to return a single entry. For example, I have a series of entries that are similar but just have different ids and "created_at" fields (e.g. like a log file). I only want to get the most recent one. I thought it would work like this:
db(:my_db)
|> search(for: "/users/accounts", in: [:name])
|> sort(:id, :desc)
|> limit(1)
|> Polo.Db.run
but this returns all entries that match "/users/accounts".
When I inserted it, I used:
db(:my_db)
|> searchable([:name])
|> Polo.Db.save(name: name, value: body)
Any help is most appreciated!
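Not an authoritative answer, but other reports in this thread pipe document queries to a first function (e.g. Moebius.Db.first). Assuming Polo.Db exposes the same, something like this may return just the top row; untested sketch:

```elixir
# first (if available on your Db module) returns a single result,
# sidestepping the limit-handling issue.
db(:my_db)
|> search(for: "/users/accounts", in: [:name])
|> sort(:id, :desc)
|> Polo.Db.first
```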
I've got a function that returns JSONB, and the keys come back as strings. Is there a way to get Moebius to return the keys as atoms, or is it a concern for the application to convert the keys to atoms?
Right now the processes are staying open (I believe) - which is probably bad. We should make sure the connections are released OR investigate Poolboy further.
Getting an issue when %Postgrex.TypeInfo{type: "timestamp"} comes through: there isn't a match on that type. I can add the fix but wanted to check with @xivSolutions first.
** (FunctionClauseError) no function clause matching in Moebius.Extensions.DateExtension.decode/4
(moebius) lib/moebius/extensions/date_extension.ex:47: Moebius.Extensions.DateExtension.decode(%Postgrex.TypeInfo{array_elem: 0, base_type: 0, comp_elems: [], input: "timestamp_in", oid: 1114, output: "timestamp_out", receive: "timestamp_recv", send: "timestamp_send", type: "timestamp"}, <<0, 1, 215, 143, 95, 251, 70, 53>>, 172079, nil)
(postgrex) lib/postgrex/query.ex:118: DBConnection.Query.Postgrex.Query.decode_row/4
(postgrex) lib/postgrex/query.ex:109: DBConnection.Query.Postgrex.Query.do_decode/5
(postgrex) lib/postgrex/query.ex:74: DBConnection.Query.Postgrex.Query.decode/3
(db_connection) lib/db_connection.ex:423: DBConnection.query/4
(postgrex) lib/postgrex.ex:111: Postgrex.query/4
(moebius) lib/moebius/database.ex:236: Moebius.Database.execute/1
(authentication) lib/moebius/database.ex:38: Authentication.Database.run/1