isi_darma's People

Contributors

apoorva1225, basrizk, chenk7166, darpan-jain, jackstudycode, jonmay, maksimstw, parsahejabi, shuailiu6626, thammegowda, timothyswang, wise-east


Forkers

tongtianli

isi_darma's Issues

Server name mismatch error

I'm getting the following error when running python -m boteval example-chat-task -b /boteval:

/Users/jcho/miniconda3/envs/boteval/lib/python3.9/site-packages/flask/app.py:1781: UserWarning: Current server name '0.0.0.0:7070' doesn't match configured server name 'dev.gowda.ai'

I think it's because of this line in the config file where the server name is set to dev.gowda.ai.

I've changed it to localhost and I get the same error.
If I change it to 0.0.0.0:7070, it seems to work, but that feels hacky. Is there a better fix?
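One way to avoid hard-coding the server name is to resolve it from the environment and leave it unset for local runs (Flask then accepts any Host header, so the mismatch warning goes away). This is a hedged sketch; the function and env-var names are illustrative, not from the repo:

```python
import os

def resolve_server_name(env=None, default=None):
    """Pick SERVER_NAME from the environment instead of hard-coding it in
    the config. Returning None (the default) lets Flask accept any Host
    header, which sidesteps the mismatch warning during local development."""
    env = os.environ if env is None else env
    return env.get("SERVER_NAME", default)
```

In production the deployment would export `SERVER_NAME=dev.gowda.ai`; local developers would leave it unset.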

websockets.exceptions.ConnectionClosedError: no close frame received or sent

[2022-08-15 21:13:10,772][mephisto.operations.operator][INFO] - Operator running task ID = 4
[2022-08-15 21:13:26,844][mephisto.abstractions.architects.channels.websocket_channel][ERROR] - Socket logged error: no close frame received or sent
Traceback (most recent call last):
File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/websockets/legacy/protocol.py", line 945, in transfer_data
message = await self.read_message()
File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/websockets/legacy/protocol.py", line 1015, in read_message
frame = await self.read_data_frame(max_size=self.max_size)
File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/websockets/legacy/protocol.py", line 1090, in read_data_frame
frame = await self.read_frame(max_size)
File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/websockets/legacy/protocol.py", line 1149, in read_frame
extensions=self.extensions,
File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/websockets/legacy/framing.py", line 70, in read
data = await reader(2)
File "/home/darma/.conda/envs/darma/lib/python3.7/asyncio/streams.py", line 677, in readexactly
raise IncompleteReadError(incomplete, n)
asyncio.streams.IncompleteReadError: 0 bytes read on a total of 2 expected bytes

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/mephisto/abstractions/architects/channels/websocket_channel.py", line 140, in run_socket
message = await websocket.recv()
File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/websockets/legacy/protocol.py", line 553, in recv
await self.ensure_open()
File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/websockets/legacy/protocol.py", line 921, in ensure_open
raise self.connection_closed_exc()
websockets.exceptions.ConnectionClosedError: no close frame received or sent
[2022-08-15 21:13:34,463][mephisto.operations.client_io_handler][INFO] - Sending alive
[2022-08-15 21:13:34,463][mephisto.abstractions.architects.channels.websocket_channel][INFO] - channel open
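A hedged sketch of one mitigation: wrap the receive call in a retry loop that reconnects when the peer drops without a close frame. The exception class below is a stand-in for websockets.exceptions.ConnectionClosedError, and the real fix would live inside Mephisto's websocket_channel rather than in a helper like this:

```python
import asyncio

class ConnectionClosedError(Exception):
    """Stand-in for websockets.exceptions.ConnectionClosedError (assumption)."""

async def recv_with_reconnect(recv, reconnect, attempts=3):
    """Retry recv(), calling reconnect() each time the connection drops
    without a close frame; re-raise after the final attempt."""
    for attempt in range(attempts):
        try:
            return await recv()
        except ConnectionClosedError:
            if attempt == attempts - 1:
                raise
            await reconnect()
```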

Reorganize: move "src" into a subdir with appropriate name

We need to reorganize the files in the repo:
"ISI Darma" has many modules (online protocol, offline protocol, crawlers, etc.).

The setup.cfg, requirements.txt, and src at the root of the repo need to be moved into a subdirectory with a suitable submodule name, similar to darma_chat.

OpenAI GPT API sometimes raises an exception saying it's busy

When this happens, our front end freezes. :(

[2022-08-25 14:29:59,638][openai][INFO] - message='OpenAI API response' path=https://api.openai.com/v1/completions processing_ms=19705 response_code=429
[2022-08-25 14:29:59,638][openai][INFO] - error_code=None error_message='The server is currently overloaded with other requests. Sorry about that! You can retry your request, or contact [email protected] if the error persists.' error_param=None error_type=server_error message='OpenAI API error received' stream_error=False
[2022-08-25 14:29:59,639][mephisto.abstractions._subcomponents.task_runner][ERROR] - Unhandled exception in unit SandboxMTurkUnit(5033, 3LG268AV375A8Q9RGHRFU278QQVREE, assigned)
Traceback (most recent call last):
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/site-packages/mephisto/abstractions/_subcomponents/task_runner.py", line 270, in _launch_and_run_unit
    self.run_unit(unit, agent)
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/site-packages/mephisto/abstractions/blueprints/parlai_chat/parlai_chat_task_runner.py", line 278, in run_unit
    world.parley()
  File "/Users/tg/work/projects/darma/isi_darma/darma_chat/darma_chat/worlds.py", line 193, in parley
    self._run_initial_turn()
  File "/Users/tg/work/projects/darma/isi_darma/darma_chat/darma_chat/worlds.py", line 472, in _run_initial_turn
    first_bot_act = self.bot.act()
  File "/Users/tg/work/projects/darma/isi_darma/darma_chat/darma_chat/gpt_agent.py", line 42, in act
    resp = self.query_completion_api(p, engine=self.engine)
  File "/Users/tg/work/projects/darma/isi_darma/darma_chat/darma_chat/gpt_agent.py", line 64, in query_completion_api
    response = openai.Completion.create(
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/site-packages/openai/api_resources/completion.py", line 31, in create
    return super().create(*args, **kwargs)
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 100, in create
    response, _, api_key = requestor.request(
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/site-packages/openai/api_requestor.py", line 122, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/site-packages/openai/api_requestor.py", line 329, in _interpret_response
    self._interpret_response_line(
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/site-packages/openai/api_requestor.py", line 362, in _interpret_response_line
    raise self.handle_error_response(
openai.error.RateLimitError: The server is currently overloaded with other requests. Sorry about that! You can retry your request, or contact [email protected] if the error persists.
[2022-08-25 14:30:22,276][mephisto.operations.operator][INFO] - Operator running task ID = 156
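Since the 429 is transient, a standard mitigation is to retry the completion call with exponential backoff instead of letting the RateLimitError propagate and freeze the front end. This is a hedged sketch; `is_rate_limit` would be something like `lambda e: isinstance(e, openai.error.RateLimitError)` in the real code:

```python
import random
import time

def call_with_backoff(call, is_rate_limit, max_retries=5, base_delay=1.0):
    """Retry `call` with exponential backoff plus a little jitter when it
    raises a rate-limit error; re-raise anything else, or after the
    final attempt."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception as e:
            if not is_rate_limit(e) or attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```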

darma_chat get_assignment raises an exception for some chats

ERROR:chat_admin.app:Exception on /review/20220904_235301_860_live [GET]
Traceback (most recent call last):
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/darma/work/isi_darma/chat_admin/chat_admin/app.py", line 340, in review_chat
    assignment = mturk.get_assignment(assignment_id=mturk_asgn_id)
  File "/home/darma/work/isi_darma/chat_admin/chat_admin/app.py", line 158, in get_assignment
    return self.client.get_assignment(AssignmentId=assignment_id)['Assignment']
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/botocore/client.py", line 508, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/botocore/client.py", line 915, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.errorfactory.RequestError: An error occurred (RequestError) when calling the GetAssignment operation: This operation can be called with a status of: Reviewable,Approved,Rejected (1662438083742 s)
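The error message says GetAssignment only works once the assignment's status is Reviewable, Approved, or Rejected. A hedged sketch of a guard for the /review page: treat the RequestError as "not reviewable yet" instead of letting the request 500 (with a real boto3 client one would pass `request_error=client.exceptions.RequestError`):

```python
def safe_get_assignment(client, assignment_id, request_error=Exception):
    """Return the assignment dict, or None when MTurk rejects the lookup
    because the assignment isn't yet Reviewable/Approved/Rejected."""
    try:
        return client.get_assignment(AssignmentId=assignment_id)["Assignment"]
    except request_error:
        return None
```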

toxic offender shouldn't have to be "B"

If the conversation block looks like:

    {
        "conversation":
        [
            {
                "speaker_id": "X",
                "text": "This is a sample conversation."
            },
            {
                "speaker_id": "Y",
                "text": "That's what you think, I'm having a real discussion over there."
            },
            {
                "speaker_id": "X",
                "text": "What time does Godot get here?"
            }
        ],
        "target_user": "X"
    }

all turns will be on the left, in yellow. If "X" is replaced by "B", then B's turns will appear in blue.

OpenAI exception while live demo

Traceback (most recent call last):
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/urllib3/connectionpool.py", line 710, in urlopen
    chunked=chunked,
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/urllib3/connectionpool.py", line 449, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/urllib3/connectionpool.py", line 444, in _make_request
    httplib_response = conn.getresponse()
  File "/home/darma/.conda/envs/darma/lib/python3.7/http/client.py", line 1373, in getresponse
    response.begin()
  File "/home/darma/.conda/envs/darma/lib/python3.7/http/client.py", line 319, in begin
    version, status, reason = self._read_status()
  File "/home/darma/.conda/envs/darma/lib/python3.7/http/client.py", line 288, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/requests/adapters.py", line 499, in send
    timeout=timeout,
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/urllib3/connectionpool.py", line 788, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/urllib3/util/retry.py", line 550, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/urllib3/packages/six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/urllib3/connectionpool.py", line 710, in urlopen
    chunked=chunked,
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/urllib3/connectionpool.py", line 449, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/urllib3/connectionpool.py", line 444, in _make_request
    httplib_response = conn.getresponse()
  File "/home/darma/.conda/envs/darma/lib/python3.7/http/client.py", line 1373, in getresponse
    response.begin()
  File "/home/darma/.conda/envs/darma/lib/python3.7/http/client.py", line 319, in begin
    version, status, reason = self._read_status()
  File "/home/darma/.conda/envs/darma/lib/python3.7/http/client.py", line 288, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/mephisto/abstractions/_subcomponents/task_runner.py", line 270, in _launch_and_run_unit
    self.run_unit(unit, agent)
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/mephisto/abstractions/blueprints/parlai_chat/parlai_chat_task_runner.py", line 278, in run_unit
    world.parley()
  File "/home/darma/work/isi_darma/darma_chat/darma_chat/worlds.py", line 205, in parley
    acts[idx] = agent.act(timeout=self.max_resp_time)
  File "/home/darma/work/isi_darma/darma_chat/darma_chat/gpt_agent.py", line 44, in act
    resp = self.query_completion_api(p, engine=self.engine, frequency_penalty=2, presence_penalty=2, temperature=1)
  File "/home/darma/work/isi_darma/darma_chat/darma_chat/gpt_agent.py", line 72, in query_completion_api
    stop=["user A:", "user B:", "user C:", "user D:"]
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/openai/api_resources/completion.py", line 31, in create
    return super().create(*args, **kwargs)
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 106, in create
    request_id=request_id,
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/openai/api_requestor.py", line 120, in request
    request_id=request_id,
  File "/home/darma/.conda/envs/darma/lib/python3.7/site-packages/openai/api_requestor.py", line 302, in request_raw
    raise error.APIConnectionError("Error communicating with OpenAI") from e
openai.error.APIConnectionError: Error communicating with OpenAI

online protocol opt out

If a user replies "opt out" to the bot in a thread, the bot won't reply to that user in that thread again, even if the user continues to be toxic (local whitelist).
If a user sends "opt out" to the bot by PM, the bot will never reply to that user in any thread, even if the user is toxic (global whitelist).
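The two opt-out scopes could be tracked in a small registry like this; names here are illustrative sketches, not code from the online protocol:

```python
class OptOutRegistry:
    """Track who has opted out of moderation, per thread and globally."""

    def __init__(self):
        self.global_optout = set()   # users who PM'd "opt out"
        self.thread_optout = {}      # thread_id -> set of opted-out users

    def opt_out_thread(self, thread_id, user):
        self.thread_optout.setdefault(thread_id, set()).add(user)

    def opt_out_global(self, user):
        self.global_optout.add(user)

    def should_reply(self, thread_id, user):
        """The bot replies only if the user hasn't opted out at either scope."""
        return (user not in self.global_optout
                and user not in self.thread_optout.get(thread_id, set()))
```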

info message when darma service is shut down

sudo systemctl start isi_darma.service writes a nice "system starting" message to the logs.

Similarly, it would be nice to see a shutdown message confirming the system is down when it is brought down normally.
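Since `systemctl stop` sends SIGTERM, one way to get the confirmation is a SIGTERM handler in the service's entry point that logs before exiting. This is a hedged sketch (handler name and log message are illustrative; the actual entry point isn't shown here):

```python
import logging
import signal
import sys

logger = logging.getLogger("isi_darma")

def handle_sigterm(signum, frame):
    """systemctl stop delivers SIGTERM; log a confirmation that the
    service is going down cleanly, then exit with status 0."""
    logger.info("isi_darma service shutting down")
    sys.exit(0)

# Register the handler at service startup (must run in the main thread).
signal.signal(signal.SIGTERM, handle_sigterm)
```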

openai version incompatible with other packages

The requirements file updated with the openai version fails with
ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts when running pip install -e . in the darma_chat directory.

Temporarily commenting out the pin works.

survey questions don't really match survey answers and BOT should be renamed MODERATOR

It's not completely clear that BOT is "the moderator" (the instructions say so, but I'd like there to be a closer connection); plus, we may do human studies, so let's rename BOT to MODERATOR.

The last two questions asked of the offline bot, "Did the moderator understand your point of view" and "Did the moderator convince you to change your behavior", aren't answered well by "Very" or "So-so".

I suggest changing "Very" to "Yes", "Somewhat" to "Mostly", and "So-so" to "Somewhat". Then, given the answers "Yes", "Mostly", "Somewhat", "Mostly not", and "Not at all", I think the following four questions (note slight rewrites) will all be answerable by those choices:

"Was the moderator coherent?"
"Was the moderator responsive to what you wrote?"
"Did the moderator understand your point of view?"
"Did the moderator convince you to change your behavior?"

rtg hangs if bombarded

When a lot of MT data is sent to the RTG API, it hangs with no error message and needs to be restarted.
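A hedged sketch of one mitigation on the client side: split the MT data into small batches instead of one huge request, so each request can carry its own timeout and a hang surfaces as an error rather than silence (the batch size is a guess):

```python
def batched(items, batch_size=32):
    """Yield successive slices of `items`, batch_size at a time, so the
    RTG API receives many small requests instead of one huge payload."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]
```

Each batch could then be posted with, e.g., `requests.post(url, json=batch, timeout=60)` so a hung server raises a timeout instead of blocking forever.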

darma_chat: Error while submitting survey results

Server log

[2022-08-11 22:57:54,613][root][INFO] - [agent 1] self.task_turn_idx: {self.task_turn_idx}, self.dialog is: {self.dialog}
[2022-08-11 22:57:54,613][root][INFO] - ModelChatWorld:conversation_id 83: About to act with task turn idx: {self.task_turn_idx}
[2022-08-11 22:58:06,254][root][ERROR] - MT API Error:: Expecting value: line 1 column 1 (char 0). Returning source.

[2022-08-11 22:58:06,254][root][INFO] - Got act for agent idx 0, act was: {'text': '', 'task_data': {'final_rating': 'Very|Very|Somewhat|Somewhat'}, 'id': 'A', 'episode_done': False, 'update_id': '9b508831-f960-40da-84bc-8e623970269f', 'timestamp': 1660283886.161773} and self.task_turn_idx: {self.task_turn_idx}.
[2022-08-11 22:58:06,261][mephisto.abstractions._subcomponents.task_runner][ERROR] - Unhandled exception in unit MockUnit(4904, assigned)
Traceback (most recent call last):
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/site-packages/mephisto/abstractions/_subcomponents/task_runner.py", line 270, in _launch_and_run_unit
    self.run_unit(unit, agent)
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/site-packages/mephisto/abstractions/blueprints/parlai_chat/parlai_chat_task_runner.py", line 278, in run_unit
    world.parley()
  File "/Users/tg/work/projects/darma/isi_darma/darma_chat/darma_chat/worlds.py", line 250, in parley
    self.final_chat_data = self.get_final_chat_data()
  File "/Users/tg/work/projects/darma/isi_darma/darma_chat/darma_chat/worlds.py", line 496, in get_final_chat_data
    data = super().get_final_chat_data()
  File "/Users/tg/work/projects/darma/isi_darma/darma_chat/darma_chat/worlds.py", line 364, in get_final_chat_data
    'model_file': self.bot.model_agent.opt.get('model_file'),
AttributeError: 'NoneType' object has no attribute 'opt'
[2022-08-11 22:58:06,263][root][INFO] - Runs completed per model: blender_90M: 1

javascript logs:
(screenshot: Screen Shot 2022-08-11 at 11 03 32 PM)
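The AttributeError comes from `self.bot.model_agent.opt.get('model_file')` when `model_agent` is None. A hedged sketch of a guard for that lookup (function name is illustrative; the real fix would go in `get_final_chat_data`):

```python
def model_file_of(bot):
    """Return the bot's model_file, or None when the bot has no
    model_agent (as with the GPT agent), instead of raising
    AttributeError on NoneType."""
    agent = getattr(bot, "model_agent", None)
    return agent.opt.get("model_file") if agent is not None else None
```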

turn pair counting not appropriately reflected in code

When the seed dialogue is 2x the number of turns to run, no turns are offered to the user, even though the logging doesn't indicate this will happen.

setting:

partial log:

Choosing the "blender_90M" model for the bot.
Creating ModelChatWorld for tag conversation_id 53 with 6 turns.
ModelChatWorld:conversation_id 53: is at turn 0, with 6 pairs of turns needed...
Use custom dialogue seeds to start conversations with some context
Context info: {'conversation': [{'speaker_id': 's1', 'text': 't1'}, {'speaker_id': 's2', 'text': 't2'}, {'speaker_id': 's3', 'text': 't3'}, {'speaker_id': 's4', 'text': 't4'}, {'speaker_id': 's5', 'text': 't5'}, {'speaker_id': 's6', 'text': 't6'}, {'speaker_id': 's7', 'text': 't7'}, {'speaker_id': 's8', 'text': 't8'}, {'speaker_id': 's9', 'text': 't9'}, {'speaker_id': 's10', 'text': 't10'}, {'speaker_id': 's11', 'text': 't11'}, {'speaker_id': 's12', 'text': 't12'}], 'target_user': 's12'}

(skipped 12 statements like:)

17:54:19 | TurkLikeAgent: In observe() before semaphore, self.turn_idx is 0 and observation is {'episode_done': False, 'id': 's1', 'text': 't1', 'fake_start': True, 'agent_idx': 1, 'update_id': '71de604b-0666-47f1-b6dc-e781cbbd533e'}
17:54:19 | TurkLikeAgent: In observe() AFTER semaphore, self.turn_idx: 0, observation["text"]: t1

...

ModelChatWorld:conversation_id 53: is at turn 1, with 6 pairs of turns needed...
ModelChatWorld:conversation_id 53: About to act with task turn idx: 1

but the GUI skips directly to conversation evaluation.

I was using this test conversation:

    {
        "conversation":
        [
            {
                "speaker_id": "s1",
                "text": "t1"
            },
            {
                "speaker_id": "s2",
                "text": "t2"
            },
            {
                "speaker_id": "s3",
                "text": "t3"
            },
            {
                "speaker_id": "s4",
                "text": "t4"
            },
            {
                "speaker_id": "s5",
                "text": "t5"
            },
            {
                "speaker_id": "s6",
                "text": "t6"
            },
            {
                "speaker_id": "s7",
                "text": "t7"
            },
            {
                "speaker_id": "s8",
                "text": "t8"
            },
            {
                "speaker_id": "s9",
                "text": "t9"
            },
            {
                "speaker_id": "s10",
                "text": "t10"
            },
            {
                "speaker_id": "s11",
                "text": "t11"
            },
            {
                "speaker_id": "s12",
                "text": "t12"
            }
        ],
        "target_user": "s12"
    }
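The symptom above is consistent with each pair of seed messages being charged against the turn budget. A hedged sketch of that suspected accounting (this mirrors the observed behavior, not confirmed code from worlds.py):

```python
def remaining_turn_pairs(num_turn_pairs, seed_len):
    """If each pair of seed messages consumes one pair from the turn
    budget, a 12-message seed with a 6-pair budget leaves zero pairs
    for the live user, so the GUI jumps straight to evaluation."""
    consumed = seed_len // 2
    return max(num_turn_pairs - consumed, 0)
```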

Darma Chat Admin Dashboard: Review, Pay, Advance (/qualify) turkers

Create a simple dashboard to present data stored in model_chat directory.

MVP:

  • Review
    • Load all completed HIT assignments
    • Review a conversation and the submitted response (read only mode)
  • Payment
    • Figure out if already paid that user for that assignment
    • One click button to pay iff not already paid
    • If the chat ran longer, figure out how much extra to pay
  • Qualification
    • Advance user to Darma-level-2 by giving them that necessary qualification
  • Security / Access control
    • Have some basic authentication

Good to have

  • Charts, plots: Any cool visualizations you want to build
  • average time taken
  • unique users
  • avg HITs by user
  • activity over time etc
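The "pay iff not already paid" button could be made idempotent by persisting a record of payments and checking it before calling the MTurk SendBonus operation. A hedged sketch (`paid_log` is any set-like persistent record; the reason string is illustrative):

```python
def pay_once(client, worker_id, assignment_id, amount, paid_log):
    """Send a bonus for (worker, assignment) only if we haven't already.
    Returns True if a payment was made, False if it was skipped."""
    key = (worker_id, assignment_id)
    if key in paid_log:
        return False
    client.send_bonus(WorkerId=worker_id, AssignmentId=assignment_id,
                      BonusAmount=str(amount),
                      Reason="DARMA chat payment")
    paid_log.add(key)
    return True
```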

unable to install requirements for chat admin

Following the directions:

(darma) [jonmay@jonmay chat_admin (main)]$ pip install -e .
Obtaining file:///Users/jonmay/projects/civil_sanctuary/isi_darma/chat_admin
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [2 lines of output]
      packages:: ['chat_admin', 'chat_admin.static', 'chat_admin.templates', 'chat_admin.static.css', 'chat_admin.static.js', 'chat_admin.static.img']
      error in chat-admin setup command: 'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Parse error at "'=5.2'": Expected string_end
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
(darma) [jonmay@jonmay chat_admin (main)]$ which python
/Users/jonmay/miniconda3/envs/darma/bin/python
(darma) [jonmay@jonmay chat_admin (main)]$ python --version
Python 3.8.13
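The `Parse error at "'=5.2'"` message usually means an install_requires entry uses a bare `=` where PEP 440 requires an operator like `==` or `>=`. A hedged sketch of the fix ("somepkg" is a placeholder; the log doesn't show which requirement carries the bad "=5.2"):

```python
# Sketch of the install_requires list in chat_admin's setup.py.
install_requires = [
    # "somepkg=5.2",   # invalid: a bare '=' is not a PEP 440 version operator
    "somepkg>=5.2",    # valid: use '>=' (or '==' to pin an exact version)
]
```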

parlai mturk task: TypeError: Object of type _AgentStateMetadata is not JSON serializable

I tried to run the offline protocol on the mturk sandbox, testing the current HEAD of the darma branch, and I got this exception:

[2022-07-19 16:10:18,706][asyncio][ERROR] - Task exception was never retrieved
future: <Task finished name='Task-152' coro=<WorkerPool.register_worker() done, defined at /Users/tg/work/projects/darma/Mephisto/mephisto/operations/worker_pool.py:143> exception=TypeError('Object of type _AgentStateMetadata is not JSON serializable')>
Traceback (most recent call last):
  File "/Users/tg/work/projects/darma/Mephisto/mephisto/operations/worker_pool.py", line 185, in register_worker
    await self.register_agent(crowd_data, worker, request_id)
  File "/Users/tg/work/projects/darma/Mephisto/mephisto/operations/worker_pool.py", line 515, in register_agent
    onboard_agent.state.set_init_state(onboard_data)
  File "/Users/tg/work/projects/darma/Mephisto/mephisto/abstractions/_subcomponents/agent_state.py", line 179, in set_init_state
    self.save_data()
  File "/Users/tg/work/projects/darma/Mephisto/mephisto/abstractions/_subcomponents/agent_state.py", line 234, in save_data
    self._save_data()
  File "/Users/tg/work/projects/darma/Mephisto/mephisto/abstractions/blueprints/parlai_chat/parlai_chat_agent_state.py", line 109, in _save_data
    self.agent.db.write_dict(agent_file, self.get_data())
  File "/Users/tg/work/projects/darma/Mephisto/mephisto/abstractions/databases/local_database.py", line 1483, in write_dict
    json.dump(target_dict, data_file)
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/json/__init__.py", line 179, in dump
    for chunk in iterable:
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/json/encoder.py", line 431, in _iterencode
    yield from _iterencode_dict(o, _current_indent_level)
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/json/encoder.py", line 438, in _iterencode
    o = _default(o)
  File "/Users/tg/opt/miniconda3/envs/darma/lib/python3.9/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type _AgentStateMetadata is not JSON serializable 

offline protocol workflow design and flow

Require the ICD signature and attestation before the rest of the test, and don't allow it to be submitted until that part is filled in.

Include another link to the task instructions in the "make sure you've read the task instructions" part.

After the quiz, simply have them submit the HIT. Include some language saying that "We will review your responses and then contact you for more interactive dialogues." (Let's pay them $0.50 for filling out the form.)

Give people who pass this a qualification to do 2 dialogues. If they pass that we give them the general qualification to do many more dialogues.
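The qualification ladder above can be sketched as a small gating function. The qualification names (`darma_pilot`, `darma_general`) are mine, not taken from the codebase:

```python
# Hypothetical qualification gating for the offline protocol:
# pass the quiz -> pilot qualification (2 dialogues);
# complete the pilot -> general qualification (many more dialogues).
def next_qualification(passed_quiz: bool, pilot_dialogues_done: int):
    if not passed_quiz:
        return None                # HIT is reviewed, no qualification granted
    if pilot_dialogues_done < 2:
        return "darma_pilot"       # allowed to do the 2 pilot dialogues
    return "darma_general"         # allowed to do many more dialogues
```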

offline protocol instructions notes

Instructions for Turkers -> Instructions
conversation on the right -> conversation below
speaker you're trying to copy -> speaker you're trying to imitate
response the relates -> response that relates

darma_chat: clean up URLs

Some of the chat threads contain URLs; remove them.
If a chat message is URL-only, drop or truncate that message.
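A minimal sketch of that cleanup (the regex and function name are mine, not from darma_chat, and the URL pattern is deliberately not exhaustive):

```python
import re
from typing import Optional

# Simple URL matcher; deliberately not exhaustive.
URL_RE = re.compile(r"https?://\S+|www\.\S+")

def clean_chat_text(text: str) -> Optional[str]:
    """Strip URLs from a chat message; return None if it was URL-only."""
    cleaned = URL_RE.sub("", text)
    cleaned = re.sub(r"\s{2,}", " ", cleaned).strip()
    return cleaned if cleaned else None
```

For example, `clean_chat_text("see https://example.com for details")` yields `"see for details"`, while a URL-only message yields `None` so the caller can drop it.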

boteval: GPT bot errors

Here is Jon's feedback on Slack:

[Screenshot: Slack feedback, 2022-10-17 12:23 PM]

Takeaways and TODOs

  1. Truncate the context. Turns are currently appended without bound:
    self.sturns += f"\nuser {msg['user_id']}: {msg['text']}"

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 4199 tokens (3175 in your prompt; 1024 for the completion). Please reduce your prompt; or completion length.

  2. Looks like the context is not correctly handled in the GPT bot:

User C says "You make me want to die. Useful idiots of racism. Keep looking elsewhere." and the bot responds "You're welcome. We are committed to creating a more civil online environment, and we will act accordingly. Thanks for letting us know."

  3. Don't let the user submit without interacting with the rating sliders.
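For the context-truncation TODO, here is a rough sketch that keeps only the newest turns within a token budget. Whitespace token counting is a crude stand-in for the model's real tokenizer (e.g. tiktoken); the 4097/1024 numbers come from the OpenAI error above.

```python
# Keep prompt + completion under the model's context limit by dropping
# the oldest turns first. Token counting here is a crude whitespace
# approximation, not the model's actual tokenizer.
MAX_PROMPT_TOKENS = 4097 - 1024  # context limit minus completion budget

def truncate_turns(turns, max_tokens=MAX_PROMPT_TOKENS):
    """Drop the oldest turns until the estimated token count fits."""
    kept, total = [], 0
    for turn in reversed(turns):        # walk from the newest turn backward
        n = len(turn.split())           # crude token estimate
        if total + n > max_tokens:
            break
        kept.append(turn)
        total += n
    return list(reversed(kept))         # restore chronological order
```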

template response should be to addressee

current behavior:
a: [statement]
b: [toxic statement of type t]
darma_bot: It looks like you're [t]. You used this kind of behavior in response to others. I feel upset because even though I don’t know others, the language you used when communicating with them would make me upset.

preferred behavior:
darma_bot: It looks like you're [t]. You used this kind of behavior in response to [a]. I feel upset because even though I don’t know [a], the language you used when communicating with them would make me upset.
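The preferred behavior amounts to a template fill that takes the addressee's name as a parameter. The template wording is from the issue; the function itself is a hypothetical sketch:

```python
# Fill both the behavior type (t) and the addressee's name, so the bot's
# empathy statement refers to the actual person who was targeted.
TEMPLATE = (
    "It looks like you're {t}. You used this kind of behavior in response "
    "to {addressee}. I feel upset because even though I don't know "
    "{addressee}, the language you used when communicating with them "
    "would make me upset."
)

def render_response(behavior_type, addressee):
    return TEMPLATE.format(t=behavior_type, addressee=addressee)
```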
