This package provides a parser that focuses on nice diagnostics.
Contributions and bug reports are welcome!
Please feel free to contact me through github or on the #haskell IRC channel on irc.freenode.net.
-Edward Kmett
Parser combinators with highlighting, slicing, layout, literate comments, Clang-style diagnostics and the kitchen sink
Home Page: http://ekmett.github.com/trifecta/
License: Other
https://ghc.haskell.org/trac/ghc/ticket/12768 is triggered by these two class methods https://github.com/ekmett/trifecta/blob/master/src/Text/Trifecta/Combinators.hs#L61 when using GND to derive an instance of DeltaParsing. This happens in the hnix package, for example: haskell-nix/hnix#47
src/Text/Trifecta/Combinators.hs:70:10:
Could not deduce (Show s)
arising from the superclasses of an instance declaration
from the context (MonadPlus m, DeltaParsing m)
bound by the instance declaration
at src/Text/Trifecta/Combinators.hs:70:10-72
Possible fix:
add (Show s) to the context of the instance declaration
In the instance declaration for ‘DeltaParsing (Lazy.StateT s m)’
src/Text/Trifecta/Combinators.hs:82:10:
Could not deduce (Show s)
arising from the superclasses of an instance declaration
from the context (MonadPlus m, DeltaParsing m)
bound by the instance declaration
at src/Text/Trifecta/Combinators.hs:82:10-74
Possible fix:
add (Show s) to the context of the instance declaration
In the instance declaration for ‘DeltaParsing (Strict.StateT s m)’
A complete build log is at http://hydra.cryp.to/build/158144/nixlog/3/raw.
Hello! I have a parser, and I've found that we sometimes get strange error messages, like the following one:
Parser:2:3: error: expected: "$", "*", "+", "-", "/", "<",
"==", ">", "^", "in", accessor (.),
ambiguous use of a left-associative operator,
ambiguous use of a non-associative operator,
ambiguous use of a right-associative operator, end of input,
letter or digit, operator, space
a; b
^
but so far I have no minimal test case for this bug.
The README says to see http://ekmett.github.com/trifecta/ but that page 404s.
AfC
It would be nice to have explanations of what each constructor does, and of each field in the constructors.
https://hackage.haskell.org/package/trifecta-1.5.2/docs/Text-Trifecta-Delta.html#t:Delta
All the concrete instances of TokenParsing discard highlighting information entirely; there was a commented-out Highlighter transformer in the code at one point, but it looks unfinished. I'd write one myself, but I'm afraid I don't understand the code well enough yet.
At a minimum it'd be nice to be able to get back highlighted error reports (as 0.53 had) and to generate highlighted versions of source fed through my parser.
There is an issue with the notion of end users using GND to derive new instances. If we put whiteSpace in TokenParser and expect users to override it, then when a user subclasses TokenParser and uses GND to derive those instances, any of those methods that call whiteSpace will call the one from the base type, due to the rules of GND!
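The pitfall can be sketched with a toy pair of classes in plain Haskell. All names here (WS, Tok, Base, Custom) are hypothetical stand-ins, not trifecta's actual classes: a default method in Tok calls ws (playing the role of whiteSpace), and the GND-derived instance keeps calling the base type's ws even after the newtype supplies its own.

```haskell
{-# LANGUAGE GeneralizedNewtypeDeriving #-}

-- WS plays the role of a class exposing whiteSpace; Tok's default
-- method calls ws, like token parsers that skip trailing whitespace.
class WS a where
  ws :: a -> String

class WS a => Tok a where
  tok :: a -> String
  tok x = "token, then " ++ ws x  -- uses the ws from *this* dictionary

data Base = Base

instance WS Base where
  ws _ = "base whitespace"

instance Tok Base

-- GND coerces Base's whole Tok dictionary over to Custom...
newtype Custom = Custom Base deriving Tok

-- ...so even though Custom overrides ws here, the derived tok still
-- uses Base's ws: the override is silently ignored inside tok.
instance WS Custom where
  ws _ = "custom whitespace"

main :: IO ()
main = do
  putStrLn (ws (Custom Base))   -- custom whitespace
  putStrLn (tok (Custom Base))  -- token, then base whitespace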
In trifecta-1.5.1.1, Text.Trifecta.Highlight has restricted its import of Text.Blaze.Internal. However, in blaze-markup < 0.3.6 this is the only module that exports preEscapedString, so that name is not in scope when building this combination of versions. Starting with blaze-markup-0.3.6.0, other modules including Text.Blaze re-export preEscapedString, so testing with only the most recent versions could not find this problem.
(Following the previous breakage with blaze-markup-0.3.6, the constraint in Idris was tightened to below this version, which now results in master failing again due to this issue.)
I'm getting slow compile times on a fairly small parser when I use a number of choice combinators. It averages around 30 seconds for my parser module.
If I enable -ddump-timings, I can see the dominant factor is the simplifier:
1013 - Simplifier [Main]: alloc=1068672688 time=1013.738
1577 - Simplifier [Main]: alloc=1353713200 time=1577.066
1687 - Simplifier [Main]: alloc=1651513384 time=1687.849
1697 - Simplifier [Main]: alloc=1619116360 time=1697.374
1970 - Simplifier [Main]: alloc=1960801872 time=1970.856
2022 - Simplifier [Main]: alloc=1865558352 time=2022.854
If I build with a fork of trifecta, I can halve the compile time by removing the INLINE pragma on satisfy in the CharParsing Parser instance in Text.Trifecta.Parser. (I can't find any other INLINE pragmas that have a similar impact.)
Is this expected behaviour, or should the INLINE be removed?
Compiling with -O0 fixes the issue for development, but 30-second compile times did surprise me.
It's quite easy to replicate; something like this should do it:
import Data.Functor ((<&>))
import qualified Text.Trifecta as T

start :: T.Parser String
start =
  T.choice [ T.integer <&> show
           , T.integer <&> show
           , T.integer <&> show
           , T.integer <&> show
           ]
Adding more T.integer <&> show lines gives a roughly linear slowdown. (On my machine, ~150ms per line, and my test has about 150 of them in total.)
Thanks!
ghc has a feature where its error messages often show the line numbers as prefixes on source lines. I'd love to have the same for trifecta's rendering of Spans, especially given the (very nice!) feature where spans on long lines show only the relevant portion of the line.
I think this should be possible due to the inclusion of the Deltas in Spans, so if you'd be willing to review a PR with this, I'll give it a try myself 👍
Assume I have a parser for a line of input:
parseLine :: Parser Line
I would like to extend this parser to add an optional question mark at the end of the input line and have the extended parser return whether or not the question mark was present:
parseLineExtended :: Parser (Line, Bool)
The issue is that the unmodified parseLine parser will accept '?' at the end of a line as valid input, so the following solution does not work:
parseNoQuestion :: Parser (Line, Bool)
parseNoQuestion = fmap (\line -> (line, False)) parseLine
parseQuestion :: Parser (Line, Bool)
parseQuestion = fmap (\line -> (line, True)) (parseLine <* char '?')
parseLineExtended :: Parser (Line, Bool)
parseLineExtended = try parseQuestion <|> parseNoQuestion
The reason the above solution does not work is that parseLine might consume the question mark at the end of the line, and parseQuestion will then fail.
So my question is: is there a way I can write parseLineExtended in terms of parseLine without making invasive changes to parseLine?
Building Idris (both the Hackage release and master) with Trifecta 1.4.3 yields the following error:
[68 of 87] Compiling Idris.ParseHelpers ( src/Idris/ParseHelpers.hs, dist/build/Idris/ParseHelpers.o )
src/Idris/ParseHelpers.hs:370:47:
No instance for (Show IState) arising from a use of ‘position’
In the second argument of ‘liftM’, namely ‘position’
In the expression: liftM ((+ 1) . fromIntegral . column) position
In an equation for ‘indent’:
indent = liftM ((+ 1) . fromIntegral . column) position
With Trifecta 1.4.2, this error does not occur.
The change seems to be a result of commit 16cd121. It seems odd that we should be required to have a Show instance for the internal state.
src/Text/Trifecta/Util/IntervalMap.hs:167:20:
Ambiguous occurrence ‘:<’
It could refer to either ‘Control.Lens.:<’,
imported from ‘Control.Lens’ at src/Text/Trifecta/Util/IntervalMap.hs:51:1-38
(and originally defined in ‘Control.Lens.Cons’)
or ‘Data.FingerTree.:<’,
imported from ‘Data.FingerTree’ at src/Text/Trifecta/Util/IntervalMap.hs:53:51-59
src/Text/Trifecta/Util/IntervalMap.hs:174:20:
Ambiguous occurrence ‘:<’
It could refer to either ‘Control.Lens.:<’,
imported from ‘Control.Lens’ at src/Text/Trifecta/Util/IntervalMap.hs:51:1-38
(and originally defined in ‘Control.Lens.Cons’)
or ‘Data.FingerTree.:<’,
imported from ‘Data.FingerTree’ at src/Text/Trifecta/Util/IntervalMap.hs:53:51-59
src/Text/Trifecta/Util/IntervalMap.hs:222:14:
Ambiguous occurrence ‘:<’
It could refer to either ‘Control.Lens.:<’,
imported from ‘Control.Lens’ at src/Text/Trifecta/Util/IntervalMap.hs:51:1-38
(and originally defined in ‘Control.Lens.Cons’)
or ‘Data.FingerTree.:<’,
imported from ‘Data.FingerTree’ at src/Text/Trifecta/Util/IntervalMap.hs:53:51-59
Trifecta-1.5.1.3 on Hackage builds fine with lens 4.12.3. Please update the .cabal file on Hackage to set lens >=4 && <=4.12.3.
I tested two statements.
λ> parseString ((char 'a' >> char 'b') <|> char 'c') mempty "ac"
Failure (interactive):1:2: error: expected: "b"
ac<EOF>
^
λ> parseString (try (char 'a' >> char 'b') <|> char 'c') mempty "ac"
Failure (interactive):1:1: error: expected: "c"
ac<EOF>
^
In the first statement, char 'c' is not even tried. In the second, char 'c' is tried before failing.
It seems backtracking doesn't work properly if the first argument to <|> is not a single parser but a set of parsers strung together by >> or <*. try probably turns such a set of parsers into a single parser. My hypothesis is that >> or <* escapes <|>.
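This behaviour matches the usual Parsec-style semantics, which can be modelled with a toy parser in plain Haskell (nothing here is trifecta's actual implementation; P, charP, andThen, orElse, and tryP are all made-up names): each parser reports whether it consumed input, <|> only tries its right-hand side when the left side failed without consuming anything, and try resets the consumed flag on failure.

```haskell
-- Toy model of Parsec/trifecta-style alternation (hypothetical names):
-- a parser returns a "consumed input" flag alongside its result, and <|>
-- only tries its right-hand side when the left failed WITHOUT consuming.
newtype P a = P { runP :: String -> (Bool, Either String (a, String)) }

charP :: Char -> P Char
charP c = P $ \s -> case s of
  (x:xs) | x == c -> (True, Right (c, xs))          -- consumed one char
  _               -> (False, Left ("expected " ++ show c))

andThen :: P a -> P b -> P b                        -- models >>
andThen (P p) (P q) = P $ \s -> case p s of
  (cp, Right (_, s')) -> let (cq, r) = q s' in (cp || cq, r)
  (cp, Left e)        -> (cp, Left e)               -- failure keeps the flag

orElse :: P a -> P a -> P a                         -- models <|>
orElse (P p) (P q) = P $ \s -> case p s of
  (False, Left _) -> q s                            -- nothing consumed: backtrack
  r               -> r                              -- consumed: commit to failure

tryP :: P a -> P a                                  -- models try
tryP (P p) = P $ \s -> case p s of
  (_, Left e) -> (False, Left e)                    -- pretend nothing was consumed
  r           -> r

main :: IO ()
main = do
  -- (char 'a' >> char 'b') <|> char 'c' on "ac": 'a' is consumed before
  -- 'b' fails, so the alternative is never tried.
  print $ snd $ runP ((charP 'a' `andThen` charP 'b') `orElse` charP 'c') "ac"
  -- with try, the failure looks consumption-free, so char 'c' runs on "ac".
  print $ snd $ runP (tryP (charP 'a' `andThen` charP 'b') `orElse` charP 'c') "ac"
```

So in this model nothing "escapes" <|>: once >> or <* has consumed input, <|> commits to the left branch's failure, and try is what undoes that commitment.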
Thanks to another IRC user, I was able to get Text parsing working with Trifecta via this code:
-- Text Rope and parsing
instance Reducer Text Rope where
  unit   = unit . strand . encodeUtf8
  cons   = cons . strand . encodeUtf8
  snoc r = snoc r . strand . encodeUtf8

parseText :: Parser a -> Delta -> Text -> Result a
parseText p d inp =
  starve $ feed inp $ stepParser (release d *> p) mempty mempty
But the copying makes me unhappy. I asked on IRC, but no one really knew: why does Trifecta only support UTF-8 ByteStrings as a first-class input stream?
It's stated in the documentation for the trifecta package that it has Clang-style colored diagnostics, but I can't find a viewable example of these diagnostics. It would be really great to add a screenshot of the nice-looking errors to the README or directly to the Hackage documentation.
trifecta-1.1 builds fine with ghc 7.6.3 on Gentoo amd64 after I loosened the blaze-html and fingertree deps. Bumping it to 1.2, I also loosened the deps so I could compile it. For me, the compile fails, so I just wanted to report that, thanks.
argus trifecta # diff -wc trifecta-1.1.ebuild trifecta-1.2.ebuild
*** trifecta-1.1.ebuild Sat Sep 28 15:16:56 2013
--- trifecta-1.2.ebuild Tue Oct 1 20:56:32 2013
***************
*** 21,32 ****
RDEPEND=">=dev-haskell/ansi-terminal-0.6:=[profile?] <dev-haskell/ansi-terminal-0.7:=[profile?]
>=dev-haskell/ansi-wl-pprint-0.6.6:=[profile?] <dev-haskell/ansi-wl-pprint-0.7:=[profile?]
>=dev-haskell/blaze-builder-0.3.0.1:=[profile?] <dev-haskell/blaze-builder-0.4:=[profile?]
! >=dev-haskell/blaze-html-0.5:=[profile?] <dev-haskell/blaze-html-0.7:=[profile?]
>=dev-haskell/blaze-markup-0.5:=[profile?] <dev-haskell/blaze-markup-0.6:=[profile?]
>=dev-haskell/charset-0.3.2.1:=[profile?] <dev-haskell/charset-1:=[profile?]
>=dev-haskell/comonad-3:=[profile?] <dev-haskell/comonad-4:=[profile?]
>=dev-haskell/deepseq-1.2.0.1:=[profile?] <dev-haskell/deepseq-1.4:=[profile?]
! >=dev-haskell/fingertree-0.0.1:=[profile?] <dev-haskell/fingertree-0.2:=[profile?]
>=dev-haskell/hashable-1.2:=[profile?] <dev-haskell/hashable-1.3:=[profile?]
>=dev-haskell/lens-3.8.2:=[profile?] <dev-haskell/lens-4:=[profile?]
>=dev-haskell/mtl-2.0.1:=[profile?] <dev-haskell/mtl-2.2:=[profile?]
--- 21,32 ----
RDEPEND=">=dev-haskell/ansi-terminal-0.6:=[profile?] <dev-haskell/ansi-terminal-0.7:=[profile?]
>=dev-haskell/ansi-wl-pprint-0.6.6:=[profile?] <dev-haskell/ansi-wl-pprint-0.7:=[profile?]
>=dev-haskell/blaze-builder-0.3.0.1:=[profile?] <dev-haskell/blaze-builder-0.4:=[profile?]
! >=dev-haskell/blaze-html-0.5:=[profile?] <dev-haskell/blaze-html-0.6:=[profile?]
>=dev-haskell/blaze-markup-0.5:=[profile?] <dev-haskell/blaze-markup-0.6:=[profile?]
>=dev-haskell/charset-0.3.2.1:=[profile?] <dev-haskell/charset-1:=[profile?]
>=dev-haskell/comonad-3:=[profile?] <dev-haskell/comonad-4:=[profile?]
>=dev-haskell/deepseq-1.2.0.1:=[profile?] <dev-haskell/deepseq-1.4:=[profile?]
! >=dev-haskell/fingertree-0.0.1:=[profile?] <dev-haskell/fingertree-0.1:=[profile?]
>=dev-haskell/hashable-1.2:=[profile?] <dev-haskell/hashable-1.3:=[profile?]
>=dev-haskell/lens-3.8.2:=[profile?] <dev-haskell/lens-4:=[profile?]
>=dev-haskell/mtl-2.0.1:=[profile?] <dev-haskell/mtl-2.2:=[profile?]
***************
*** 42,50 ****
>=dev-haskell/cabal-1.10
test? ( >=dev-haskell/doctest-0.9.1 )
"
-
- src_prepare() {
- cabal_chdeps \
- 'blaze-html >= 0.5 && < 0.6' 'blaze-html >= 0.5 && < 0.7' \
- 'fingertree >= 0.0.1 && < 0.1' 'fingertree >= 0.0.1 && < 0.2'
- }
--- 42,44 ----
argus trifecta # mv trifecta-1.1.ebuild trifecta-1.2.ebuild
argus trifecta # repoman fix
RepoMan scours the neighborhood...
>>> Creating Manifest for /var/lib/layman/haskell/dev-haskell/trifecta
ebuild.notadded 1
dev-haskell/trifecta/trifecta-1.2.ebuild
Note: use --include-dev (-d) to check dependencies for 'dev' profiles
RepoMan sez: "You're only giving me a partial QA payment?
I'll take it this time, but I'm not happy."
argus trifecta # emerge -av dev-haskell/trifecta
These are the packages that would be merged, in order:
Calculating dependencies... done!
[ebuild U ~] dev-haskell/trifecta-1.2:0/1.2::gentoo-haskell [1.1:0/1.1::gentoo-haskell] USE="doc hoogle hscolour profile {test}" 0 kB
Total: 1 package (1 upgrade), Size of downloads: 0 kB
Would you like to merge these packages? [Yes/No] y
>>> Verifying ebuild manifests
>>> Emerging (1 of 1) dev-haskell/trifecta-1.2 from gentoo-haskell
>>> Failed to emerge dev-haskell/trifecta-1.2, Log file:
>>> '/var/tmp/portage/dev-haskell/trifecta-1.2/temp/build.log'
>>> Jobs: 0 of 1 complete, 1 failed Load avg: 0.66, 0.33, 0.17
* Package: dev-haskell/trifecta-1.2
* Repository: gentoo-haskell
* Maintainer: [email protected]
* USE: amd64 doc elibc_glibc hoogle hscolour kernel_linux profile test userland_GNU
* FEATURES: compressdebug installsources preserve-libs sandbox splitdebug test userpriv usersandbox
>>> Unpacking source...
>>> Unpacking trifecta-1.2.tar.gz to /var/tmp/portage/dev-haskell/trifecta-1.2/work
>>> Source unpacked in /var/tmp/portage/dev-haskell/trifecta-1.2/work
>>> Preparing source in /var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2 ...
* CHDEP: 'blaze-html >= 0.5 && < 0.6' -> 'blaze-html >= 0.5 && < 0.7'
* CHDEP: 'fingertree >= 0.0.1 && < 0.1' -> 'fingertree >= 0.0.1 && < 0.2'
>>> Source prepared.
>>> Configuring source in /var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2 ...
* Using cabal-1.16.0.3.
* Prepending /usr/lib64/ghc-7.6.3 to LD_LIBRARY_PATH
/usr/bin/ghc -package Cabal-1.16.0.3 --make /var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2/Setup.lhs -dynamic -o setup
[1 of 1] Compiling Main ( /var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2/Setup.lhs, /var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2/Setup.o )
Linking setup ...
./setup configure --ghc --prefix=/usr --with-compiler=/usr/bin/ghc --with-hc-pkg=/usr/bin/ghc-pkg --prefix=/usr --libdir=/usr/lib64 --libsubdir=trifecta-1.2/ghc-7.6.3 --datadir=/usr/share/ --datasubdir=trifecta-1.2/ghc-7.6.3 --with-haddock=/usr/bin/haddock --enable-library-profiling --enable-tests --ghc-option=-optl-Wl,--hash-style=gnu --ghc-option=-optl-Wl,-O1 --ghc-option=-optl-Wl,--as-needed --disable-executable-stripping --docdir=/usr/share/doc/trifecta-1.2 --verbose
Configuring trifecta-1.2...
Dependency ansi-terminal ==0.6.*: using ansi-terminal-0.6
Dependency ansi-wl-pprint >=0.6.6 && <0.7: using ansi-wl-pprint-0.6.6
Dependency array >=0.3.0.2 && <0.5: using array-0.4.0.1
Dependency base >=4.4 && <5: using base-4.6.0.1
Dependency blaze-builder >=0.3.0.1 && <0.4: using blaze-builder-0.3.1.1
Dependency blaze-html >=0.5 && <0.7: using blaze-html-0.6.1.1
Dependency blaze-markup ==0.5.*: using blaze-markup-0.5.1.5
Dependency bytestring >=0.9.1 && <0.11: using bytestring-0.10.0.2
Dependency charset >=0.3.2.1 && <1: using charset-0.3.5
Dependency comonad ==3.*: using comonad-3.1
Dependency containers >=0.3 && <0.6: using containers-0.5.0.0
Dependency deepseq >=1.2.0.1 && <1.4: using deepseq-1.3.0.1
Dependency directory >=1.0: using directory-1.2.0.1
Dependency doctest >=0.9.1: using doctest-0.9.8
Dependency filepath -any: using filepath-1.3.0.1
Dependency fingertree >=0.0.1 && <0.2: using fingertree-0.1.0.0
Dependency ghc-prim -any: using ghc-prim-0.3.0.0
Dependency hashable ==1.2.*: using hashable-1.2.1.0
Dependency lens >=3.8.2 && <4: using lens-3.9.1
Dependency mtl >=2.0.1 && <2.2: using mtl-2.1.2
Dependency parsers >=0.5 && <1: using parsers-0.9
Dependency reducers ==3.*: using reducers-3.0.2
Dependency semigroups >=0.8.3.1 && <1: using semigroups-0.11
Dependency transformers >=0.2 && <0.4: using transformers-0.3.0.0
Dependency unordered-containers >=0.2.1 && <0.3: using
unordered-containers-0.2.3.3
Dependency utf8-string >=0.3.6 && <0.4: using utf8-string-0.3.7
Using Cabal-1.16.0.3 compiled by ghc-7.6
Using compiler: ghc-7.6.3
Using install prefix: /usr
Binaries installed in: /usr/bin
Libraries installed in: /usr/lib64/trifecta-1.2/ghc-7.6.3
Private binaries installed in: /usr/libexec
Data files installed in: /usr/share/trifecta-1.2/ghc-7.6.3
Documentation installed in: /usr/share/doc/trifecta-1.2
Using alex version 3.1.0 found on system at: /usr/bin/alex
Using ar found on system at: /usr/bin/ar
Using c2hs version 0.16.5 found on system at: /usr/bin/c2hs
Using cpphs version 1.17.1 found on system at: /usr/bin/cpphs
No ffihugs found
Using gcc version 4.7.3 found on system at: /usr/bin/gcc
Using ghc version 7.6.3 given by user at: /usr/bin/ghc
Using ghc-pkg version 7.6.3 given by user at: /usr/bin/ghc-pkg
No greencard found
Using haddock version 2.13.2.1 given by user at: /usr/bin/haddock
Using happy version 1.19.0 found on system at: /usr/bin/happy
No hmake found
Using hpc version 0.6 found on system at: /usr/bin/hpc
Using hsc2hs version 0.67 found on system at: /usr/bin/hsc2hs
Using hscolour version 1.20 found on system at: /usr/bin/HsColour
No hugs found
No jhc found
Using ld found on system at: /usr/bin/ld
No lhc found
No lhc-pkg found
No nhc98 found
Using pkg-config version 0.28 found on system at: /usr/bin/pkg-config
Using ranlib found on system at: /usr/bin/ranlib
Using strip found on system at: /usr/bin/strip
Using tar found on system at: /bin/tar
No uhc found
>>> Source configured.
>>> Compiling source in /var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2 ...
./setup build
Building trifecta-1.2...
Preprocessing library trifecta-1.2...
[ 1 of 13] Compiling Text.Trifecta.Util.Array ( src/Text/Trifecta/Util/Array.hs, dist/build/Text/Trifecta/Util/Array.o )
[ 2 of 13] Compiling Text.Trifecta.Util.Combinators ( src/Text/Trifecta/Util/Combinators.hs, dist/build/Text/Trifecta/Util/Combinators.o )
[ 3 of 13] Compiling Text.Trifecta.Util.IntervalMap ( src/Text/Trifecta/Util/IntervalMap.hs, dist/build/Text/Trifecta/Util/IntervalMap.o )
[ 4 of 13] Compiling Text.Trifecta.Instances ( src/Text/Trifecta/Instances.hs, dist/build/Text/Trifecta/Instances.o )
[ 5 of 13] Compiling Text.Trifecta.Delta ( src/Text/Trifecta/Delta.hs, dist/build/Text/Trifecta/Delta.o )
[ 6 of 13] Compiling Text.Trifecta.Rope ( src/Text/Trifecta/Rope.hs, dist/build/Text/Trifecta/Rope.o )
[ 7 of 13] Compiling Text.Trifecta.Util.It ( src/Text/Trifecta/Util/It.hs, dist/build/Text/Trifecta/Util/It.o )
[ 8 of 13] Compiling Text.Trifecta.Highlight ( src/Text/Trifecta/Highlight.hs, dist/build/Text/Trifecta/Highlight.o )
Loading package ghc-prim ... linking ... done.
Loading package integer-gmp ... linking ... done.
Loading package base ... linking ... done.
Loading package array-0.4.0.1 ... linking ... done.
Loading package deepseq-1.3.0.1 ... linking ... done.
Loading package bytestring-0.10.0.2 ... linking ... done.
Loading package utf8-string-0.3.7 ... linking ... done.
Loading package containers-0.5.0.0 ... linking ... done.
Loading package text-0.11.3.1 ... linking ... done.
Loading package hashable-1.2.1.0 ... linking ... done.
Loading package nats-0.1.2 ... linking ... done.
Loading package unordered-containers-0.2.3.3 ... linking ... done.
Loading package semigroups-0.11 ... linking ... done.
Loading package tagged-0.7 ... linking ... done.
Loading package transformers-0.3.0.0 ... linking ... done.
Loading package comonad-3.1 ... linking ... done.
Loading package fingertree-0.1.0.0 ... linking ... done.
Loading package transformers-compat-0.1.1.1 ... linking ... done.
Loading package contravariant-0.4.4 ... linking ... done.
Loading package distributive-0.3.1 ... linking ... done.
Loading package semigroupoids-3.1 ... linking ... done.
Loading package comonad-transformers-3.1 ... linking ... done.
Loading package mtl-2.1.2 ... linking ... done.
Loading package comonads-fd-3.0.3 ... linking ... done.
Loading package bifunctors-3.2.0.1 ... linking ... done.
Loading package profunctors-3.3.0.1 ... linking ... done.
Loading package free-3.4.2 ... linking ... done.
Loading package keys-3.0.3 ... linking ... done.
Loading package data-default-class-0.0.1 ... linking ... done.
Loading package data-default-instances-base-0.0.1 ... linking ... done.
Loading package data-default-instances-containers-0.0.1 ... linking ... done.
Loading package dlist-0.5 ... linking ... done.
Loading package data-default-instances-dlist-0.0.1 ... linking ... done.
Loading package old-locale-1.0.0.5 ... linking ... done.
Loading package data-default-instances-old-locale-0.0.1 ... linking ... done.
Loading package data-default-0.5.3 ... linking ... done.
Loading package stm-2.4.2 ... linking ... done.
Loading package pointed-3.1 ... linking ... done.
Loading package reducers-3.0.2 ... linking ... done.
Loading package charset-0.3.5 ... linking ... done.
Loading package parsers-0.9 ... linking ... done.
Loading package extensible-exceptions-0.1.1.4 ... linking ... done.
Loading package MonadCatchIO-transformers-0.3.0.0 ... linking ... done.
Loading package filepath-1.3.0.1 ... linking ... done.
Loading package pretty-1.1.1.0 ... linking ... done.
Loading package template-haskell ... linking ... done.
Loading package generic-deriving-1.6.2 ... linking ... done.
Loading package parallel-3.2.0.3 ... linking ... done.
Loading package groupoids-3.0.1.1 ... linking ... done.
Loading package semigroupoid-extras-3.0.1 ... linking ... done.
Loading package profunctor-extras-3.3.3.1 ... linking ... done.
Loading package reflection-1.3.2 ... linking ... done.
Loading package split-0.2.2 ... linking ... done.
Loading package primitive-0.5.1.0 ... linking ... done.
Loading package vector-0.10.9.1 ... linking ... done.
Loading package void-0.6.1 ... linking ... done.
Loading package lens-3.9.1 ... linking ... done.
Loading package blaze-builder-0.3.1.1 ... linking ... done.
Loading package blaze-markup-0.5.1.5 ... linking ... done.
Loading package blaze-html-0.6.1.1 ... linking ... done.
Loading package time-1.4.0.1 ... linking ... done.
Loading package unix-2.6.0.1 ... linking ... done.
Loading package ansi-terminal-0.6 ... linking ... done.
Loading package ansi-wl-pprint-0.6.6 ... linking ... done.
[ 9 of 13] Compiling Text.Trifecta.Rendering ( src/Text/Trifecta/Rendering.hs, dist/build/Text/Trifecta/Rendering.o )
[10 of 13] Compiling Text.Trifecta.Combinators ( src/Text/Trifecta/Combinators.hs, dist/build/Text/Trifecta/Combinators.o )
[11 of 13] Compiling Text.Trifecta.Result ( src/Text/Trifecta/Result.hs, dist/build/Text/Trifecta/Result.o )
[12 of 13] Compiling Text.Trifecta.Parser ( src/Text/Trifecta/Parser.hs, dist/build/Text/Trifecta/Parser.o )
src/Text/Trifecta/Parser.hs:100:20:
No instance for (Monoid a) arising from a use of `mappend'
Possible fix:
add (Monoid a) to the context of the instance declaration
In the first argument of `liftA2', namely `mappend'
In the expression: liftA2 mappend
In an equation for `mappend': mappend = liftA2 mappend
src/Text/Trifecta/Parser.hs:102:17:
No instance for (Monoid a) arising from a use of `mempty'
Possible fix:
add (Monoid a) to the context of the instance declaration
In the first argument of `pure', namely `mempty'
In the expression: pure mempty
In an equation for `mempty': mempty = pure mempty
* ERROR: dev-haskell/trifecta-1.2::gentoo-haskell failed (compile phase):
* setup build failed
*
* Call stack:
* ebuild.sh, line 93: Called src_compile
* environment, line 2977: Called haskell-cabal_src_compile
* environment, line 2278: Called cabal_src_compile
* environment, line 744: Called cabal-build
* environment, line 515: Called die
* The specific snippet of code:
* ./setup "$@" || die "setup build failed"
*
* If you need support, post the output of `emerge --info '=dev-haskell/trifecta-1.2::gentoo-haskell'`,
* the complete build log and the output of `emerge -pqv '=dev-haskell/trifecta-1.2::gentoo-haskell'`.
* The complete build log is located at '/var/tmp/portage/dev-haskell/trifecta-1.2/temp/build.log'.
* The ebuild environment file is located at '/var/tmp/portage/dev-haskell/trifecta-1.2/temp/environment'.
* Working directory: '/var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2'
* S: '/var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2'
* Messages for package dev-haskell/trifecta-1.2:
* ERROR: dev-haskell/trifecta-1.2::gentoo-haskell failed (compile phase):
* setup build failed
*
* Call stack:
* ebuild.sh, line 93: Called src_compile
* environment, line 2977: Called haskell-cabal_src_compile
* environment, line 2278: Called cabal_src_compile
* environment, line 744: Called cabal-build
* environment, line 515: Called die
* The specific snippet of code:
* ./setup "$@" || die "setup build failed"
*
* If you need support, post the output of `emerge --info '=dev-haskell/trifecta-1.2::gentoo-haskell'`,
* the complete build log and the output of `emerge -pqv '=dev-haskell/trifecta-1.2::gentoo-haskell'`.
* The complete build log is located at '/var/tmp/portage/dev-haskell/trifecta-1.2/temp/build.log'.
* The ebuild environment file is located at '/var/tmp/portage/dev-haskell/trifecta-1.2/temp/environment'.
* Working directory: '/var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2'
* S: '/var/tmp/portage/dev-haskell/trifecta-1.2/work/trifecta-1.2'
argus trifecta #
Hello!
While reading the Haskell book, I came across trifecta. I'm trying to wrap my head around <|> but am still not able to understand it. In simple words, is (<|>) a monadic choice?
p = a <|> b -- use parser a; if it fails, then use b?
If yes, then why is the following parser failing?
parseFraction :: Parser Rational
parseFraction = do
  numerator <- decimal
  char '/'
  denominator <- decimal
  case denominator of
    0 -> fail "denominator cannot be zero"
    _ -> return (numerator % denominator)

type RationalOrDecimal = Either Rational Integer

parseRationalOrDecimal = (Left <$> parseFraction) <|> (Right <$> decimal)

main = do
  let p f i = parseString f mempty i
  print $ p (some (skipMany (oneOf "\n") *> parseRationalOrDecimal <* skipMany (oneOf "\n"))) "10"
In a perfect world, if parseFraction fails, then decimal should work.
1. What am I missing?
2. Why do we need to use try, when <|> should run the second parser on the first one's failure?
parseRationalOrDecimal = try (Left <$> parseFraction) <|> (Right <$> decimal)
The second argument of parseString is a Delta:
parseString :: Parser a -> Delta -> String -> Result a
...but I don't know what Delta I should pass to parseString. My initial guess was (Columns 0 0), and that works okay, but I have no clue why, or whether there is a better solution that I'm missing.
Note that the reason I'm not using parseFromFile is that I'm writing a shell, which needs to parse interactive input. This is the other reason why I suspect I'm doing something wrong: the error has incorrect line information (always line 1, most likely because I'm passing in the wrong Delta).
Is it possible to get the position of an error from a failed parse? I see that Result is either
Success a
Failure Doc
but it doesn't seem like I'd be able to extract the line number from the Doc. The docs for Err also indicate that the position isn't included, but not the reason why.
Is there a way to get this information?
Seems to be working fine with the latest versions of those, for me.
reducers 3.10.1, a dependency of trifecta, specifies comonad == 4.* and fingertree == 0.1.*
trifecta/src/Text/Trifecta/Result.hs, lines 116 to 119 at 16e065c
Currently it is not possible to use Text.Trifecta.Result directly for comparison in a test case, because it doesn't derive Eq. Result should also derive Eq so that there can be tests for parsers written with trifecta.
Known workarounds:
- wrap Result and compare the wrapped contents as necessary

I'm trying to get incremental parsing of a repetition of values/tokens, but it seems that I can't get the parser restarted from the new offset to process the leftover and further chunks.
Consider a simple case where we parsed the first byte/char "y" and we have a leftover "x":
"yx" `feed` stepParser (char 'x') (Columns 1 1)
StepCont (Rope (Columns 2 2) (fromList [Strand "yx" (Columns 2 2)])) (Success 'x') ...
This works great, but I don't want to re-supply dummy input and want to benefit from the Skipping strand:
"x" `feed` (Skipping (Columns 1 1) `feed` stepParser (char 'x') (Columns 1 1))
StepCont (Rope (Columns 2 2) (fromList [Skipping (Columns 1 1),Strand "x" (Columns 1 1)])) (Failure (ErrInfo {_errDoc = (interactive):1:2: error: unexpected EOF, expected: "x"
1 | x<EOF>
| ^ , _errDeltas = [Columns 1 1]})) ...
I would expect there to be a sort of equivalence:
(stepParser p mempty) ~ (Skipping pos `feed` stepParser p pos)
(where the difference is only in the numeric error offsets).
It would be great to have a new release of trifecta on Hackage, since currently all its reverse dependencies (like Idris) are dead in the water without patching if GHC 8.0.2 is used.
trifecta-1.1 requires hashable == 1.2.*, but it seems to compile with hashable-1.1.2.5, which is shipped with Haskell Platform 2013.2.0.
A GHC 8.2 compatible release would be appreciated.
The error message is:
src/Text/Trifecta/Highlight.hs:46:15:
Ambiguous occurrence ‘Comment’
It could refer to either ‘Text.Blaze.Internal.Comment’,
imported from ‘Text.Blaze.Internal’ at src/Text/Trifecta/Highlight.hs:35:1-26
or ‘Text.Parser.Token.Highlight.Comment’,
imported from ‘Text.Parser.Token.Highlight’ at src/Text/Trifecta/Highlight.hs:36:1-34
A complete build log is available at http://hydra.cryp.to/build/493580/nixlog/1/raw.
Consider the following snippet:
{-# LANGUAGE OverloadedStrings #-}
module Main where
import Text.Trifecta
import Data.ByteString.Char8()
main :: IO ()
main = parseTest p "hello"
  where
    p = do
      string "h"
      (_ :~ s) <- spanned $ string "ello"
      warnAt [] "Fix this" $ render $ Fixit s "should be OMFGLOLOL"
When run, it gives this output:
$ runghc trifecta.hs
(interactive):1:2: warning: Fix this
hello<EOF>
~~~~
should be>
()
You would normally expect the fixit string to be as long as necessary.
Something seems wrong with either stepParser or feed: the first of these tests passes while the second does not:
case_incrementality1 :: Assertion
case_incrementality1 =
    unsafeFS (parseByteString pa mempty fullstr)
      @=? unsafeFS (starve (feed tstr (feed istr (stepParser (release mempty *> pa) mempty B8.empty))))
  where
    fullstr = B8.concat [istr, tstr]
    istr = B8.pack "(a"
    tstr = B8.pack "a)"

case_incrementality2 :: Assertion
case_incrementality2 =
    unsafeFS (parseByteString pa mempty fullstr)
      @=? unsafeFS (starve (feed tstr (stepParser (release mempty *> pa) mempty istr)))
  where
    fullstr = B8.concat [istr, tstr]
    istr = B8.pack "(a"
    tstr = B8.pack "a)"

pa :: Parser String
pa = parens (many $ char 'a')

unsafeFS :: Result t -> t
unsafeFS (Success a) = a
unsafeFS (Failure td) = error $ "Errors: " ++ show td
In particular, the second fails with:
ERROR: Errors: (interactive):1:1: error: expected: "("
a)<EOF>
^
I'm not sure if this is a misunderstanding on my end or a bug in the implementation.
Consider these two cases:
Prelude Text.Trifecta> starve (feed "e x" (feed "h" (stepParser (release mempty *> some (notFollowedBy someSpace *> anyChar) <* someSpace) mempty mempty)))
Success "he"
Prelude Text.Trifecta> starve (feed "e x" (feed "h" (stepParser (release mempty *> sliced (some (notFollowedBy someSpace *> anyChar)) <* someSpace) mempty mempty)))
Success ""
In particular, that parser really ought not to be able to accept the empty string, since it is just a combination of lookahead and some anyChar.
That empty string is, in particular, the mempty
passed to wantIt
in sliceIt
. As might be expected, calling one more feed
(with mempty, even) before calling starve
causes the correct answer to emerge.
I am not smart enough to understand what It
is doing, but this sure seems like a bug.
Trifecta doesn't show the unexpected token in the error message like parsec does in the following example:
import Text.Parser.Token
import Text.Parser.Char
import Text.Parser.Combinators
import Text.Parser.Token.Highlight
import qualified Text.Parsec as ParsingLib
-- import qualified Text.Trifecta as ParsingLib
import Control.Applicative
import Data.HashSet (fromList)
identStyle :: CharParsing m => IdentifierStyle m
identStyle = IdentifierStyle
  { _styleName = "identifier"
  , _styleStart = letter
  , _styleLetter = letter
  , _styleReserved = fromList ["if", "then", "else"]
  , _styleHighlight = Identifier
  , _styleReservedHighlight = ReservedIdentifier
  }

data Term
  = If Term Term Term
  | Id String
  deriving Show
identifier :: (Monad m, TokenParsing m) => m String
identifier = ident identStyle <?> "identifier"
reserved :: (Monad m, TokenParsing m) => String -> m ()
reserved = reserve identStyle
ifExpr :: (Monad m, TokenParsing m) => m Term
ifExpr = If <$> (reserved "if" *> term) <*> (reserved "then" *> term) <*> (reserved "else" *> term)
term :: (Monad m, TokenParsing m) => m Term
term = ifExpr <|> (Id <$> identifier)
main :: IO ()
main = do
ParsingLib.parseTest term "if true then false else null"
putStrLn "--"
ParsingLib.parseTest term "if true then false then false"
putStrLn "--"
ParsingLib.parseTest term "else"
Parsec output:
If (Id "true") (Id "false") (Id "null")
--
parse error at (line 1, column 20):
unexpected "t"
expecting "else"
--
parse error at (line 1, column 5):
unexpected reserved identifier "else"
expecting identifier
Trifecta output:
If (Id "true") (Id "false") (Id "null")
--
(interactive):1:20: error: expected: "else"
if true then false then false<EOF>
^
--
(interactive):1:1: error: expected: "if", identifier
else<EOF>
^
The unexpected token is very useful in this case because it shows why else is not a valid identifier (it is reserved).
#28 pointed out that we need a race combinator for when (<|>) doesn't do what someone expects. We should do that.
Any chance of getting a new release tag soon? This seems to be a blocker for Idris at the moment.
This works as expected:
Prelude Text.Trifecta> parseTest (string "foo" >> string "bar") "foo bar"
(interactive):1:4: error: expected: "bar"
foo bar<EOF>
^
But if I add try:
Prelude Text.Trifecta> parseTest (try $ string "foo" >> string "bar") "foo bar"
(interactive):1:1: error: unspecified error
foo bar<EOF>
^
Hello! I would love to be able to disable highlighting on some systems (for example, in terminals that do not support it). It would be great if we could just apply a "style" to our parser; such a style could contain color definitions and a flag saying whether colors are wanted.
Seems to be working fine w/ 0.6.1.1 for me, here.
I've been patching that with sed -ie '49s/6/7/' trifecta.cabal in the Arch package: https://aur.archlinux.org/packages/ha/haskell-trifecta/PKGBUILD
The Cabal file prevents us from building Trifecta with recent versions of blaze-html, blaze-markup, hashable, fingertree, and comonad in NixOS. The build succeeds fine, though, when those restrictions are simply dropped: https://github.com/NixOS/cabal2nix/blob/master/src/Cabal2Nix/PostProcess.hs#L221.
Could you please release a new version, eventually, that lifts those restrictions?
code:

import Control.Applicative ((<|>))
import Text.Trifecta

data Expr
  = Apply Expr Expr
  | Lambda Char Expr
  | Name Char
  deriving (Show)

parseExpr :: Parser Expr
parseExpr = parseApply <|> parseLambda <|> parseName where
  parseApply = do
    f <- parseExpr
    _ <- spaces
    g <- parseExpr
    pure $ Apply f g
  parseLambda = do
    _ <- char '\\'
    v <- anyChar
    _ <- char '.'
    e <- parseExpr
    pure $ Lambda v e
  parseName = Name <$> anyChar

main :: IO ()
main = print $ parseString parseExpr mempty "a"
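The hang here is not a trifecta bug: parseApply calls parseExpr as its very first step without consuming any input, so the parser recurses forever (left recursion). The usual fix is to parse a sequence of non-application atoms and fold them into left-nested Apply nodes. A sketch of that rewrite, written against base's Text.ParserCombinators.ReadP so it is self-contained (the same shape works with trifecta's Parser):

```haskell
import Control.Applicative ((<|>))
import Data.Char (isAlpha)
import Text.ParserCombinators.ReadP

data Expr
  = Apply Expr Expr
  | Lambda Char Expr
  | Name Char
  deriving (Show, Eq)

-- Left recursion removed: an expression is one or more atoms,
-- folded left into nested applications.
parseExpr :: ReadP Expr
parseExpr = do
  a  <- parseAtom
  as <- many (char ' ' *> parseAtom)
  pure (foldl Apply a as)

-- An atom never begins with a recursive call to parseExpr.
parseAtom :: ReadP Expr
parseAtom = parseLambda <|> parseName

parseLambda :: ReadP Expr
parseLambda = Lambda <$> (char '\\' *> get) <*> (char '.' *> parseExpr)

parseName :: ReadP Expr
parseName = Name <$> satisfy isAlpha

-- Keep only parses that consume the whole input.
parseAll :: String -> [Expr]
parseAll s = [e | (e, "") <- readP_to_S parseExpr s]
```

With this shape, parseAll "a b c" yields a single left-nested application instead of looping.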
It would be preferable to either update the docs or link to Hackage instead.
@ekmett I relaxed the bounds on reducers, and it looks like parsers needs a new release from GitHub before an allow-newer clean build of trifecta is doable.
The current Monoid instance of the Trifecta parser is too trivial to be useful:

instance Monoid (Parser a) where
  mappend = (<|>)
  mempty = empty

A much more useful instance would be:

instance Monoid a => Monoid (Parser a) where
  mappend = liftA2 mappend
  mempty = pure mempty
This instance could, in fact, be added to the parsers library. Here's a simple example of use:
decimal :: Parser String
decimal = some digit <> option "" (string "." <> some digit)
Note that the instance becomes even more useful with some additional combinators, such as moptional, concatMany, and concatSome from the incremental-parser package. But that's an optional extension.
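To illustrate the proposal without depending on trifecta, here is a sketch against base's Text.ParserCombinators.ReadP. The lifted append is spelled as a standalone operator (<+>) standing in for what mappend would become, and the moptional/concatMany/concatSome definitions are my own guesses at their incremental-parser namesakes, not copies:

```haskell
import Control.Applicative (Alternative, liftA2, many, some, (<|>))
import Data.Char (isDigit)
import Text.ParserCombinators.ReadP (ReadP, readP_to_S, satisfy, string)

-- The lifted append that the proposed instance would expose as mappend:
(<+>) :: (Applicative f, Semigroup a) => f a -> f a -> f a
(<+>) = liftA2 (<>)

-- Combinators in the style of incremental-parser's moptional,
-- concatMany and concatSome (definitions sketched, not copied):
moptional :: (Alternative f, Monoid a) => f a -> f a
moptional p = p <|> pure mempty

concatMany :: (Alternative f, Monoid a) => f a -> f a
concatMany = fmap mconcat . many

concatSome :: (Alternative f, Monoid a) => f a -> f a
concatSome = fmap mconcat . some

digit :: ReadP Char
digit = satisfy isDigit

-- The decimal example from above, with <+> in place of <>:
decimal :: ReadP String
decimal = some digit <+> moptional (string "." <+> some digit)

-- Keep only parses that consume the whole input.
parseAll :: ReadP a -> String -> [a]
parseAll p s = [x | (x, "") <- readP_to_S p s]
```

With the instance from the proposal in scope, decimal could be written with <> directly, exactly as shown above.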
{-# LANGUAGE NoMonomorphismRestriction #-}
{-# LANGUAGE PackageImports #-}
{-# LANGUAGE OverloadedStrings #-}
module Trifecta where
import Text.Trifecta
import Text.Trifecta.Highlight.Prim
import Text.Trifecta.Parser.Token.Style
import Text.Trifecta.Parser.Identifier.Style
import Control.Applicative
import Control.Monad
import qualified "unordered-containers" Data.HashSet as HS
import Data.ByteString.Char8(ByteString, unpack)
import IPPrint
type Name = String
data AST
  = Named Name AST
  | Name Name
  | Word Name
  | String String
  | List [AST]
  deriving Show

pStream = many pElem

pElem =
      pNamedObject
  <|> pName
  <|> pString
  <|> pWord
  <|> pList

pNonName = choice
  [ pString
  , pWord
  , pList
  ]

lexName = unpack <$> lexeme (char '.' *> identifier)

pName = Name <$> lexName
pWord = Word <$> lexeme ((:) <$> alphaNum <*> many (alphaNum <|> oneOf "./-"))
pNamedObject = try $ Named <$> lexName <*> pNonName
pString = String <$> stringLiteral
pList = List <$> (lexeme (char '[') *> pStream <* lexeme (char ']'))

piqLanguage = LanguageDef piqComment piqIdent piqOp
  where
    piqComment = CommentStyle "" "" "%" False
    piqIdent = IdentifierStyle
      { styleName = "identifier"
      , styleStart = void letter
      , styleLetter = void (alphaNum <|> char '-')
      , styleReserved = HS.fromList ["true", "false"]
      , styleHighlight = Identifier
      , styleReservedHighlight = ReservedIdentifier
      }
    piqOp = emptyOps

parse = flip runLanguage piqLanguage (pStream <* eof)
test = pprint =<< parseFromFile parse "piqi-expanded.piqi"
blows up on the extended piqi file from:
trifecta requires 0.01 <= fingertree < 0.1, whereas reducers requires fingertree == 0.1.*.