
o365beat's People

Contributors

chris-counteractive, mahotosasaki, yanpeng7


o365beat's Issues

Parsing Extended Properties

Can any parsing be done on the client side for this? I found that within Graylog a set of regex replacements converts it into proper JSON so the fields can be broken out: replace `,[\r\n]+ "Value": "` with `:"`, replace `[\r\n]+ "Value": "` with `"`, and replace `,[\r\n]+ "` with `:"`.

Understanding Authentication Data

Hello,

This is more of a comment that I think anyone who is interested in collecting O365 Audit data should understand.

Authentication data is very skewed because MS did not develop it for audit purposes, but rather from a developer perspective.
There are 3 main categories of authentication operations:

  • UserLoggedIn
  • MailboxLogin
  • UserLoginFailed

At a quick glance this would suggest that all failures fall under UserLoginFailed, but from what I can tell, any failure that is the result of an incorrect user action falls under UserLoginFailed, while failures due to non-user-specific controls can fall under UserLoggedIn. The difference is between mistyping a password and using incorrect cached data: the first is a user error, the second is a system error.

Users should understand that LogonError is the key to determining whether a UserLoggedIn operation is reporting a successful login or just an attempted login that went wrong. I have also noticed that a LogonError is reported on logouts. MS is not good at documenting this information, and I will be posting to the forums about these findings.

I can understand why things are reported this way from a developer perspective, as they are reported as "system tried to respond to an action, system successfully responded," but from an audit perspective there is a lot to parse out here.

Elastic 7.6 allows filtering on nested terms as well, and ExtendedProperties is going to be your friend when validating successful vs. unsuccessful login attempts.

I am working on a way to keep the original data unchanged in my beats system while also reporting it properly in the ECS event fields. I think it is crucial to keep all the base MS information, but cleaning the data before it leaves the beat will be important to avoid costly Logstash pipelines down the road.
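A minimal sketch of that approach, reusing the convert processor in copy mode the way the configs shown elsewhere in these issues do, so the original Microsoft fields stay on the event while the ECS fields are added alongside them (the field selection below is only an illustration):

processors:
  - convert:
      fields:
        - {from: Operation, to: 'event.action', type: string}     # ecs core
        - {from: ResultStatus, to: 'event.outcome', type: string} # ecs extended
        - {from: UserId, to: 'user.id', type: string}             # ecs core
      ignore_missing: true
      fail_on_error: false
      mode: copy  # copy (rather than rename) keeps the raw O365 fields intact alongside the ECS copies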

This is not an o365beat development issue but rather an MS O365 API design issue that I think everyone here should be aware of. MS needs to get their documentation in line with what the API actually reports.

Killing the beat process from the command line is unreliable

When running the beat from the command line, ctrl-c does not reliably or quickly kill the process. The code for the beat’s main loop is all part of the libbeat framework, but it’s possible we’re doing something in our custom code that contributes to this problem. We've not seen this negatively impact the overall functioning of the beat, but it should be addressed if possible.

To work around this issue we recommend using the service commands, or killing the process using the PID from another terminal.

Incorrect file name on o365beat-1.4.3-x86_64.rpm.sha512

$ sha512sum -c o365beat-1.4.3-x86_64.rpm.sha512 
sha512sum: o365beat-7.4.0-x86_64.rpm: No such file or directory
o365beat-7.4.0-x86_64.rpm: FAILED open or read
sha512sum: WARNING: 1 listed file could not be read

Apparently, the file name inside the .sha512 file should read o365beat-1.4.3-x86_64.rpm

object mapping for [ModifiedProperties]

I have this in the beat's log:
Nov 25 11:10:00 logstash logstash[103730]: [2019-11-25T11:10:00,543][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"o365-2019.11.25", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x322ca0c8>], :response=>{"index"=>{"_index"=>"o365-2019.11.25", "_type"=>"_doc", "_id"=>"7jycoW4BRbr-kt3SOEWj", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [ModifiedProperties] tried to parse field [null] as object, but found a concrete value"}}}}
and I don't know whether this is a problem with the logs or with the beat.

Cannot index event

Hi,

When I start o365beat I get the error below, and I am not able to view events in Kibana Discover. Can you please let me know the reason for this issue and help me with a solution?

2019-12-03T12:05:01.328Z WARN elasticsearch/client.go:535 Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0x0, ext:63710376320, loc:(*time.Location)(nil)}, Meta:common.MapStr(nil), Fields:common.MapStr{"Actor":[]interface {}{common.MapStr{"ID":"Device Registration Service", "Type":1}, common.MapStr{"ID":"01cb2876-7ebd-4aa4-9cc9-d28bd4d359a9", "Type":2}, common.MapStr{"ID":"ServicePrincipal_d7922889-6244-40de-9aa2-0c60ab7e7ae9", "Type":2}, common.MapStr{"ID":"d7922889-6244-40de-9aa2-0c60ab7e7ae9", "Type":2}, common.MapStr{"ID":"ServicePrincipal", "Type":2}}, "ActorContextId":"2ab446d6-6d34-4e72-a643-1634ef0758fd", "ActorIpAddress":"", "AzureActiveDirectoryEventType":1, "ClientIP":"", "CreationTime":"2019-11-26T14:45:20", "ExtendedProperties":[]interface {}{common.MapStr{"Name":"resultType", "Value":"Success"}, common.MapStr{"Name":"auditEventCategory", "Value":"Other"}, common.MapStr{"Name":"nCloud", "Value":""}, common.MapStr{"Name":"actorContextId", "Value":"2ab446d6-6d34-4e72-a643-1634ef0758fd"}, common.MapStr{"Name":"actorObjectId", "Value":"d7922889-6244-40de-9aa2-0c60ab7e7ae9"}, common.MapStr{"Name":"actorObjectClass", "Value":"ServicePrincipal"}, common.MapStr{"Name":"teamName", "Value":"MSODS."}, common.MapStr{"Name":"targetContextId", "Value":"2ab446d6-6d34-4e72-a643-1634ef0758fd"}, common.MapStr{"Name":"targetObjectId", "Value":"7d88f04c-05db-457a-b71e-d570a9d27bfe"}, common.MapStr{"Name":"extendedAuditEventCategory", "Value":"Device"}, common.MapStr{"Name":"targetName", "Value":"LAPTOP-G1RJLVQA"}, common.MapStr{"Name":"targetIncludedUpdatedProperties", "Value":"[]"}, common.MapStr{"Name":"correlationId", "Value":"fe1c3106-30b0-462a-a85e-eeb44b657e6d"}, common.MapStr{"Name":"version", "Value":"2"}, common.MapStr{"Name":"additionalDetails", "Value":"{}"}, common.MapStr{"Name":"env_ver", "Value":"2.1"}, common.MapStr{"Name":"env_name", "Value":"#Ifx.AuditSchema#IfxMsods.AuditCommonEvent"}, common.MapStr{"Name":"env_time", "Value":"2019-11-26T14:45:20.6592929Z"}, common.MapStr{"Name":"env_epoch", "Value":"7CQ0J"}, common.MapStr{"Name":"env_seqNum", "Value":"57710097"}, common.MapStr{"Name":"env_popSample", "Value":"0"}, common.MapStr{"Name":"env_iKey", "Value":"ikey"}, common.MapStr{"Name":"env_flags", "Value":"257"}, common.MapStr{"Name":"env_cv", "Value":"##01431934-caf7-42e8-ab93-ccdacdedcfab_00000000-0000-0000-0000-000000000000_01431934-caf7-42e8-ab93-ccdacdedcfab"}, common.MapStr{"Name":"env_os", "Value":""}, common.MapStr{"Name":"env_osVer", "Value":""}, common.MapStr{"Name":"env_appId", "Value":"restdirectoryservice"}, common.MapStr{"Name":"env_appVer", "Value":"1.0.11379.0"}, common.MapStr{"Name":"env_cloud_ver", "Value":"1.0"}, common.MapStr{"Name":"env_cloud_name", "Value":"MSO-BL2"}, common.MapStr{"Name":"env_cloud_role", "Value":"restdirectoryservice"}, common.MapStr{"Name":"env_cloud_roleVer", "Value":"1.0.11379.0"}, common.MapStr{"Name":"env_cloud_roleInstance", "Value":"BL2RDSR575"}, common.MapStr{"Name":"env_cloud_environment", "Value":"PROD"}, common.MapStr{"Name":"env_cloud_deploymentUnit", "Value":"R5"}}, "Id":"7a8bd47b-6492-4510-ab7a-0e2744499923", "ModifiedProperties":[]interface {}{common.MapStr{"Name":"Included Updated Properties", "NewValue":"", "OldValue":""}}, "ObjectId":"Not Available", "Operation":"Update device.", "OrganizationId":"2ab446d6-6d34-4e72-a643-1634ef0758fd", "RecordType":8, "ResultStatus":"Success", "SupportTicketId":"", "Target":[]interface {}{common.MapStr{"ID":"Device_7d88f04c-05db-457a-b71e-d570a9d27bfe", 
"Type":2}, common.MapStr{"ID":"7d88f04c-05db-457a-b71e-d570a9d27bfe", "Type":2}, common.MapStr{"ID":"Device", "Type":2}, common.MapStr{"ID":"LAPTOP-G1RJLVQA", "Type":1}}, "TargetContextId":"2ab446d6-6d34-4e72-a643-1634ef0758fd", "UserId":"ServicePrincipal_d7922889-6244-40de-9aa2-0c60ab7e7ae9", "UserKey":"Not Available", "UserType":4, "Version":1, "Workload":"AzureActiveDirectory", "agent":common.MapStr{"ephemeral_id":"dba1b10f-549e-422e-a6a8-ed123b4fb8b9", "hostname":"ip-1-0-0-171.ap-south-1.compute.internal", "id":"2b1c5393-9866-4e26-85f2-184ae5c17acf", "type":"o365beat", "version":"1.4.3"}, "cloud":common.MapStr{"account":common.MapStr{"id":"544851249924"}, "availability_zone":"ap-south-1c", "image":common.MapStr{"id":"ami-0ce933e2ae91880d3"}, "instance":common.MapStr{"id":"i-08a56a13d34ef21d3"}, "machine":common.MapStr{"type":"t3a.medium"}, "provider":"aws", "region":"ap-south-1"}, "ecs":common.MapStr{"version":"1.1.0"}, "host":common.MapStr{"architecture":"x86_64", "containerized":false, "hostname":"ip-1-0-0-171.ap-south-1.compute.internal", "id":"ec2eb2a972ed3b680c44e709588d1e20", "name":"ip-1-0-0-171.ap-south-1.compute.internal", "os":common.MapStr{"codename":"Karoo", "family":"redhat", "kernel":"4.14.152-127.182.amzn2.x86_64", "name":"Amazon Linux", "platform":"amzn", "version":"2"}}}, Private:interface {}(nil), TimeSeries:false}, Flags:0x0} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [ModifiedProperties] of type [keyword] in document with id 'u82my24Bg0YgRrVHQvjp'. Preview of field's value: '{OldValue=, NewValue=, Name=Included Updated Properties}'","caused_by":{"type":"illegal_state_exception","reason":"Can't get text on a START_OBJECT at 1:1643"}}

Many Thanks,
Madhu

[ModifiedProperties] Can't get text on a START_OBJECT

I have this in the log:
Nov 26 11:29:12 logstash logstash[46377]: [2019-11-26T11:29:12,410][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"o365-2019.11.26", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x52d5bf53>], :response=>{"index"=>{"_index"=>"o365-2019.11.26", "_type"=>"_doc", "_id"=>"-QrUpm4BlECIBFDuKB-8", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ModifiedProperties] of type [text] in document with id '-QrUpm4BlECIBFDuKB-8'. Preview of field's value: '{NewValue=SharingLinks.7bv3e4a1-cfd3-77nb-5ac4-5fef5e7cbb87.OrganizationEdit.e64cw741-123d-5dff-7375-12e76345f0cy, Name=Name}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:1421"}}}}}
I don't know whether this is normal or not.

Vendor field not correctly set, which caused failed tests during `make release`

The BeatVendor field isn't correctly set in the build configuration, so it defaults to "Elastic." This then causes a test case to fail in the make release process, because it expects the license to be different if the build is Elastic. We need to set devtools.BeatVendor in the magefile (something the latest libbeat template does) to correct this issue. The builds are solid even with this error, but it should be corrected to give a clean build process to anyone forking this repo.

Incidentally, I tried to fix this by setting the BEAT_VENDOR environment variable in the Makefile, but that didn't propagate.

Feature request: Azure beats

Hello,
excellent beat 👍 appreciate your work.

Unfortunately, I couldn't find the events I'm looking for with o365beat because they are in a different API.

I'm not a developer, so maybe you can take a look at these 3 APIs:
https://docs.microsoft.com/en-us/azure/active-directory/identity-protection/howto-identity-protection-graph-api
and make an AzureRiskBeat cloned from your o365beat, or something of the sort? Getting the data looks similar to what you're doing, just a different API (again, I'm not a developer, so this is only an opinion).

I've also seen a Logstash plugin related to Azure Event Hub; it also looks similar to what you are doing with o365beat.
I personally prefer a beat as the source of data, which then sends the data to Logstash.
Maybe that could be another AzureEventHubBeat?

Incorrect system time can lead to failed requests

The beat relies on the system time to calculate the spans for which it queries the API. If the system time is wrong in a particular way, it can cause the beat to ask the API for content outside the span of the allowed 7 day window, which kicks out errors.

Not sure this is a bug with the beat, but might be worth creating an error or warning if the system time and "API time" are different (or different enough). Consider pulling the current time from a source that matches the API (or the API itself?) and doing the datetime arithmetic using that.

o365 audit.exchange not returning threat audit logs

Lately, I have received a requirement from a customer to integrate O365 audit logs (Exchange included) into Elasticsearch.

All was going fine until I couldn't get the logs related to the E3 O365 subscription (Exchange Online P2 plan), which has built-in threat detection and blocking for spoofing, malware, and URL scanning.

None of those events are coming through for Exchange, aside from the Exchange admin audit activities related to the use of cmdlets.

I want to know whether this is a limitation of the code or of Microsoft itself.
Thanks.

regards
asad

SIGABRT during `make release` packaging

An error occurs during make release, as shown below. This does not appear to break the build, but requires some research into its root cause. It was introduced after modifying the magefile to fix #9, but it still occurs even after removing that new code. It existed before and after the update to libbeat 7.5.1, and after a full re-pull of all the requisite docker images.

It does not happen when building the project locally (i.e., make vs. make release); it appears to occur during the packaging phase.

>> package: Building o365beat type=zip for platform=windows/amd64
>> package: Building o365beat type=rpm for platform=linux/amd64
>> package: Building o365beat type=tar.gz for platform=linux/amd64
>> package: Building o365beat type=deb for platform=linux/amd64
>> package: Building o365beat type=tar.gz for platform=darwin/amd64
>> package: Building o365beat type=tar.gz for platform=linux/386
>> package: Building o365beat type=deb for platform=linux/386
>> package: Building o365beat type=rpm for platform=linux/386
>> package: Building o365beat type=docker for platform=linux/amd64
>> package: Building o365beat type=zip for platform=windows/386
free(): invalid pointer
SIGABRT: abort
PC=0x7f5d1ab67e97 m=0 sigcode=18446744073709551610
signal arrived during cgo execution

goroutine 1 [syscall, locked to thread]:
runtime.cgocall(0x4afd50, 0xc420055cc0, 0xc420055ce8)
        /usr/lib/go-1.8/src/runtime/cgocall.go:131 +0xe2 fp=0xc420055c90 sp=0xc420055c50
github.com/docker/docker-credential-helpers/secretservice._Cfunc_free(0x110ada0)
        github.com/docker/docker-credential-helpers/secretservice/_obj/_cgo_gotypes.go:111 +0x41 fp=0xc420055cc0 sp=0xc420055c90
github.com/docker/docker-credential-helpers/secretservice.Secretservice.List.func5(0x110ada0)
        /build/golang-github-docker-docker-credential-helpers-cMhSy1/golang-github-docker-docker-credential-helpers-0.5.0/obj-x86_64-linux-gnu/src/github.com/docker/docker-credential-helpers/secretservice/secretservice_linux.go:96 +0x60 fp=0xc420055cf8 sp=0xc420055cc0
github.com/docker/docker-credential-helpers/secretservice.Secretservice.List(0x0, 0x756060, 0xc420016370)
        /build/golang-github-docker-docker-credential-helpers-cMhSy1/golang-github-docker-docker-credential-helpers-0.5.0/obj-x86_64-linux-gnu/src/github.com/docker/docker-credential-helpers/secretservice/secretservice_linux.go:97 +0x217 fp=0xc420055da0 sp=0xc420055cf8
github.com/docker/docker-credential-helpers/secretservice.(*Secretservice).List(0x77e548, 0xc420055e88, 0x410022, 0xc4200162d0)
        <autogenerated>:4 +0x46 fp=0xc420055de0 sp=0xc420055da0
github.com/docker/docker-credential-helpers/credentials.List(0x756ba0, 0x77e548, 0x7560e0, 0xc42000e018, 0x0, 0x10)
        /build/golang-github-docker-docker-credential-helpers-cMhSy1/golang-github-docker-docker-credential-helpers-0.5.0/obj-x86_64-linux-gnu/src/github.com/docker/docker-credential-helpers/credentials/credentials.go:145 +0x3e fp=0xc420055e68 sp=0xc420055de0
github.com/docker/docker-credential-helpers/credentials.HandleCommand(0x756ba0, 0x77e548, 0x7fff19bb54db, 0x4, 0x7560a0, 0xc42000e010, 0x7560e0, 0xc42000e018, 0x40e398, 0x4d35c0)
        /build/golang-github-docker-docker-credential-helpers-cMhSy1/golang-github-docker-docker-credential-helpers-0.5.0/obj-x86_64-linux-gnu/src/github.com/docker/docker-credential-helpers/credentials/credentials.go:60 +0x16d fp=0xc420055ed8 sp=0xc420055e68
github.com/docker/docker-credential-helpers/credentials.Serve(0x756ba0, 0x77e548)
        /build/golang-github-docker-docker-credential-helpers-cMhSy1/golang-github-docker-docker-credential-helpers-0.5.0/obj-x86_64-linux-gnu/src/github.com/docker/docker-credential-helpers/credentials/credentials.go:41 +0x1cb fp=0xc420055f58 sp=0xc420055ed8
main.main()
        /build/golang-github-docker-docker-credential-helpers-cMhSy1/golang-github-docker-docker-credential-helpers-0.5.0/secretservice/cmd/main_linux.go:9 +0x4f fp=0xc420055f88 sp=0xc420055f58
runtime.main()
        /usr/lib/go-1.8/src/runtime/proc.go:185 +0x20a fp=0xc420055fe0 sp=0xc420055f88
runtime.goexit()
        /usr/lib/go-1.8/src/runtime/asm_amd64.s:2197 +0x1 fp=0xc420055fe8 sp=0xc420055fe0

goroutine 17 [syscall, locked to thread]:
runtime.goexit()
        /usr/lib/go-1.8/src/runtime/asm_amd64.s:2197 +0x1

User Data Enhancement

Just a suggestion:
In the o365beat.yml file, it might be prudent to change UserKey to the user id field.
UserId in O365 typically relates to user.email and user.name.

My suggestion is:

processors:
  - dissect:
      field: UserId
      tokenizer: '[%{user.name}@%{user.domain}]'
      when:
        contains:
          UserId: '@'

  - convert:
      fields:
        - {from: UserKey, to: 'user.id', type: 'string'}

This maps UserId directly to ECS (as I believe that is the primary use case for this beat), but it could be modified by adding UserName and UserDomain as fields and then converting those to ECS in the conversion processor (see the sketch below).
I am still new to ELK, so I am not sure how to modify the mappings. I have been doing these in Logstash, but it is clear they would fit here.
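A minimal sketch of that variation, assuming the dissect and convert processors behave as in the configs shown elsewhere in these issues (dissect writes its results under the dissect.* prefix by default); the UserName and UserDomain names are placeholders, not fields the beat already produces:

processors:
  - dissect:
      field: UserId
      tokenizer: '%{UserName}@%{UserDomain}'
      when:
        contains:
          UserId: '@'
  - convert:
      fields:
        - {from: 'dissect.UserName', to: 'user.name', type: string}
        - {from: 'dissect.UserDomain', to: 'user.domain', type: string}
      ignore_missing: true
      fail_on_error: false
      mode: copy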

Live Reloading Credentials

Hello,
First of all great job guys with the beat!

I would like to ask if there is a possibility of adding a live-reload feature, something similar to this:

filebeat.config.inputs:
  enabled: true
  path: configs/*.yml
  reload.enabled: true
  reload.period: 10s

What I would like to achieve is creating an external config file with the credentials and being able to change it without restarting the beat.
If this is achievable, are you planning to add it?

Response body not printed for authentication errors which hinders debugging.

Debugging and error information for authentication issues should be just as verbose as it is for API calls. The most common error during auth is an incorrect client secret, but the current debugging output won't show those details; it just prints ERROR instance/beat.go:877 Exiting: non-200 status during auth. and the response object, but not the response body, which contains the useful details.

The fix is to match the API-call debug message and include the body.

o365beat-registry.json?

Hi,

What am I supposed to put for registry_file_path: ${O365BEAT_REGISTRY_PATH:./o365beat-registry.json}?

Thanks,
Andrew
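For context, that setting uses libbeat's ${VAR:default} expansion: if the O365BEAT_REGISTRY_PATH environment variable is not set, the registry (the small file where the beat keeps its state between runs) is written to ./o365beat-registry.json in the working directory. A minimal sketch pointing it at a fixed location instead (the path below is only an example):

o365beat:
  # where the beat persists its state between runs
  registry_file_path: /var/lib/o365beat/o365beat-registry.json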

Docker Instructions

I would like to run this beat as a Docker container, however I don't see any instructions for doing so. Is this something that is currently supported?

AzureActiveDirectory Logs not pulled

I have been using o365beat to successfully pull in logs from 3 different tenants for the last couple of months. As of the 1st of November, no AzureActiveDirectory logs have been pulled. I have checked the logs (o365beat.txt) and the config file (o365beat.yml.txt) and cannot find an error.

Publisher ID is not set

When using o365beat I receive the error:

{"code":"AF429","message":"Too many requests. Method=GetContents, PublisherId=00000000-0000-0000-0000-000000000000"}

It looks like the Publisher ID is not set from the directory_id in the config file, so the requests get rate-limited along with everything else using the all-zeros Publisher ID. Based on other Azure AD API scripts I have used, I believe the Publisher ID can be arbitrary.

My current config looks like this (hard coded values):
tenant_domain: removed
client_secret: removed
client_id: removed # aka application id (GUID)
directory_id: removed # aka tenant id (GUID)
registry_file_path: ./o365beat.state

I have tried setting the directory ID as an environment variable as well.

Thanks for any help.

ECS filter doesn't correctly parse ClientIP fields that have unusual format (like in Exchange events)

Certain event types (primarily from Exchange, it seems) store IP and port information in the ClientIP field in the following format: [192.168.0.1]:12345. The convert processor as currently configured ({from: "ClientIP", to: "client.ip", type: ip}) doesn't handle that format, so those events end up with no client.ip ECS field (and filters acting on it, like perhaps logstash geoip, won't fire).
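A dissect-based sketch for that bracketed format, along the lines of the processor entries shown in other issues here (the clientip/clientport names land under the dissect.* prefix by default):

processors:
  - dissect:
      field: ClientIP
      tokenizer: '[%{clientip}]:%{clientport}'
      when:
        contains:
          ClientIP: '['
  - convert:
      fields:
        - {from: 'dissect.clientip', to: 'client.ip', type: ip}
      ignore_missing: true
      fail_on_error: false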

Preventing Duplicate Events

We've noticed duplication of events and we're looking at ways to prevent them. I tried adding the add_id processor, but it's not available in the list of processors.
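One possible approach, sketched under the assumption of a libbeat version (7.6 or later) that ships the fingerprint processor and an Elasticsearch output that honors @metadata._id; with the older libbeat these releases are built on, the equivalent would have to happen downstream (for example a Logstash fingerprint filter or an ingest pipeline):

processors:
  - fingerprint:
      fields: ["Id"]                 # the O365 audit record's unique Id GUID
      target_field: "@metadata._id"  # used as the document id, so re-sent events overwrite instead of duplicating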

`make release` names package files with libbeat version rather than beat version

The packages generated during make release are labeled with the libbeat version (lately, 7.4.0) rather than the beat version (lately, 1.4.1). The version in the beat itself, which makes it into the event metadata, is correct. I think this has to do with how environment variables are (or are not) passed to the docker containers used in the build process, but for now I've just been renaming the packages before upload with something like

for f in o365beat-7.4.0-*; do mv "$f" "$(echo "$f" | sed s/7\.4\.0/1\.4\.1/)"; done

Low priority, but worth fixing at some point.

401 Response

I configured the yml file as instructed but keep getting a 401 (Authorization Denied) when running. Any help with this?

Question regarding installing other modules on o365beat

Hello,

Once again I have a question about what beat utilities are available. In my use case I am using beats to pull cloud logs. I am curious whether it is possible to install filebeat modules on o365beat and use them as additional inputs, or is the architecture of the beat different enough that this is not possible? Once again, I apologize if this question seems straightforward; I am still learning about the inner workings of beats.

Beat doesn't detect when auditing is disabled (and docs don't emphasize it's necessary)

This beat requires that auditing be enabled in the Office 365 tenancy. This is separate from the subscriptions being enabled for the API, and is not currently documented in o365beat's README. It should be clearly documented, with screenshots, just like the app API permissions instructions.

Also, when auditing is disabled the API kicks back a particular error message: Microsoft.Office.Compliance.Audit.DataServiceException: Tenant <tenantID> does not exist. The beat should detect this and give a more helpful, actionable error message.

Certificate signed by unknown authority message

Hi - I am having an issue when attempting to run o365beat within a Docker container; could you provide some assistance? At the point where o365beat runs and attempts to authenticate with O365, it says the certificate is signed by an unknown authority, which is odd because when I inspected the cert it is signed by DigiCert.


This is my Dockerfile:

FROM ubuntu:latest
ADD https://github.com/counteractive/o365beat/releases/download/v1.5.1/o365beat-1.5.1-amd64.deb /tmp/

RUN  apt-get install /tmp/o365beat-1.5.1-amd64.deb

COPY o365beat.yml /etc/o365beat/
RUN chmod go-w /etc/o365beat/o365beat.yml
CMD [ "/usr/bin/o365beat" ]

Let me know if you need any further information, and thank you in advance for any help.

ClientIP real-world values don't match ip data type in fields.yml, causing errors in elasticsearch parsing

Certain real-world values of the ClientIP field don't match the ip data type currently defined in fields.yml, leading to parsing errors.

For example, the bracketed IP format like [2222:2222:222:2222::2]:22222 or null leads to the following parsing error in elasticsearch:

(status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [ClientIP] of type [ip] in document with id '<ID>'. Preview of field's value: '[2222:2222:222:2222::2]:22222'","caused_by
":{"type":"illegal_argument_exception","reason":"'[2222:2222:222:2222::2]:22222' is not an IP string literal."}}

The simplest fix is to define ClientIP as a string, then do some better parsing when converting that to the ECS field client.ip.
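A minimal sketch of what that fields.yml change might look like, assuming the usual libbeat fields.yml layout (the key and title values are placeholders, not necessarily what this repo uses):

- key: o365
  title: o365beat
  fields:
    - name: ClientIP
      type: keyword
      description: >
        Raw client address string from the Management Activity API; may include
        brackets and a port (e.g. [2222:2222:222:2222::2]:22222), so it is kept
        as a plain string and parsed separately into client.ip.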

Log content changed recently?

Have the contents of the Office 365 audit logs changed recently?

I noticed that since last month I'm no longer seeing location data (e.g., origin country of a login) in the logs.

Has anyone else noticed this, or is it something on my end?

Thanks,
Gen.

Exiting: error loading config file: yaml: line 2: did not find expected node content

Hi,

I'm getting the error:

Exiting: error loading config file: yaml: line 2: did not find expected node content

When I try to start the o365beat.

# o365beat --path.config /etc/o365beat -c o365beat.yml -e -d "*" --strict.perms=false

o365beat:
  tenant_domain: company.onmicrosoft.com
  client_secret: {secret}
  client_id:     {id}  # aka application id (GUID)
  directory_id:  {id}  # aka tenant id (GUID)
  registry_file_path: /etc/o365beat/o365beat.state
  content_types:
    - Audit.AzureActiveDirectory
    - Audit.Exchange
    - Audit.SharePoint
    - Audit.General
processors:
  - add_fields:
    fields:
      tenant: {company}
  - dissect:
      field: ClientIP
      tokenizer: '[%{clientip}]:%{clientport}'
      when:
        contains:
          ClientIP: '['
  - dissect:
      field: ClientIP
      tokenizer: '%{clientip}:%{clientport}'
      when:
        contains:
          ClientIP: ':'
        not:
          contains:
            ClientIP: '['
  - convert:
      fields:
        - {from: Id, to: 'event.id', type: string}                # ecs core
        - {from: RecordType, to: 'event.code', type: string}      # ecs extended
        - {from: Operation, to: 'event.action', type: string}     # ecs core
        - {from: OrganizationId, to: 'cloud.account.id', type: string} # ecs extended
        - {from: Workload, to: 'event.category', type: string}    # ecs core
        - {from: ResultStatus, to: 'event.outcome', type: string} # ecs extended
        - {from: UserId, to: 'user.id', type: string}             # ecs core
        - {from: ClientIP, to: 'client.ip', type: ip}             # ecs core
        - {from: 'dissect.clientip', to: 'client.ip', type: ip}   # ecs core
        - {from: Parameters, type: string}                        # no ecs mapping
        - {from: ExtendedProperties, type: string}                # no ecs mapping
        - {from: ModifiedProperties, type: string}                # no ecs mapping
      ignore_missing: true
      fail_on_error: false
      mode: copy # default
fields:
  tenant: {company}
setup.kibana:
cloud.id: "{clusterId}"
cloud.auth: "{clusterAuth}"

Index pattern doesn't change

Configuration:

setup.ilm.enabled: false
setup.template.enabled: true
setup.template.name: "testbeat-%{[agent.version]}"
setup.template.pattern: "testbeat-%{[agent.version]}-*"

output.elasticsearch:
  index: "testbeat-%{[agent.version]}-%{+yyyy.MM.dd}"

Expected result:
The name of the new index pattern is testbeat-%{[agent.version]}-%{+yyyy.MM.dd}

Result:
It's still using the o365beat-%{[agent.version]}-%{+yyyy.MM.dd} index pattern

Certain events don't trigger the dissect processor

Some events (e.g., "CrmDefaultActivity") don't trigger the dissect processor that populates the "client.ip" field, because the "ClientIP" data is in the format "192.168.1.1:80" rather than "[192.168.1.1]:80".

I managed to fix this with an extra dissect processor entry:

  - dissect:
      field: ClientIP
      tokenizer: '%{clientip}:%{clientport}'
      when:
        contains:
          ClientIP: ':'
        not:
          contains:
            ClientIP: '['

Logstash connection errors

Good afternoon,

I've recently started using a Logstash output rather than Elasticsearch for o365beat. Logstash is on the same host as the beat instance, and Logstash forwards the logs on to Elasticsearch as well as another output.

Since this configuration change I'm getting regular errors in the system log as follows:

o365beat[98265]: 2020-05-12T12:16:16.451+0100        ERROR        logstash/async.go:256        Failed to publish events caused by: write tcp [::1]:32906->[::1]:5045: write: connection reset by peer
o365beat[98265]: 2020-05-12T12:21:13.851+0100        ERROR        pipeline/output.go:121        Failed to publish events: write tcp 127.0.0.1:35260->127.0.0.1:5045: write: connection reset by peer
o365beat[98265]: 2020-05-12T12:21:13.851+0100        INFO        pipeline/output.go:95        Connecting to backoff(async(tcp://localhost:5045))
o365beat[98265]: 2020-05-12T12:21:13.852+0100        INFO        pipeline/output.go:105        Connection to backoff(async(tcp://localhost:5045)) established

I'm still getting Office 365 logs from the beat in Elasticsearch and it doesn't look like any logs are missing, but I can't be sure.

Any idea what I can check to find out why this error is coming up?

Thanks,
Gen.

visualization not working

O365 Beat version 1.5.0

When I open any of the visualizations I get a message such as the one below.

Saved object is missing

Could not locate that index-pattern (id: ceb2f990-f469-11e9-9f4d-5dd1f9c9e483),

Also the O365 Dashboard opens but all the panes say

Could not locate that index-pattern (id: ceb2f990-f469-11e9-9f4d-5dd1f9c9e483), click here to re-create it

Any ideas? I am new to Elasticsearch.

Thanks

Auto-subscribe functionality doesn't work with empty subscription list

If you've never subscribed to a particular Management Activity API content type before, the listSubscriptions function can return an empty list (or a list without that particular content type), rather than a list with the content type marked "disabled". This meant that the auto-subscribe logic, which looped over the listSubscriptions results, would not actually subscribe you unless you'd been previously subscribed.

Deployed a tentative fix in 3b8e38b; once it's tested I'll issue a new release.

Error parsing Audit.AzureActiveDirectory (other content types work fine)

Dec 10 07:58:45 testhost o365beat[11770]: 2019-12-10T07:58:45.130Z#011WARN#011elasticsearch/client.go:535#011Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0x0, ext:63711558741, loc:(*time.Location)(nil)}, Meta:common.MapStr(nil), Fields:common.MapStr{"Actor":[]interface {}{common.MapStr{"ID":"6f7dc456-1e61-4b2a-a913-4d3d69d30570", "Type":0}, common.MapStr{"ID":"[email protected]", "Type":5}, common.MapStr{"ID":"10037FFE91234567", "Type":3}}, "ActorContextId":"9383ac17-fa44-43b4-b883-6ac627ac89ed", "ActorIpAddress":"10.10.10.10", "ApplicationId":"5e3ce6c0-2b1f-4285-8d4b-75ee78787346", "AzureActiveDirectoryEventType":1, "ClientIP":"10.10.10.10", "CreationTime":"2019-12-10T07:12:21", "ExtendedProperties":"[{\"Name\":\"UserAgent\",\"Value\":\"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.108 Safari/537.36\"} {\"Name\":\"UserAuthenticationMethod\",\"Value\":\"9\"} {\"Name\":\"RequestType\",\"Value\":\"OAuth2:Authorize\"} {\"Name\":\"ResultStatusDetail\",\"Value\":\"Redirect\"} {\"Name\":\"KeepMeSignedIn\",\"Value\":\"False\"}]", "Id":"07aa789d-5426-4f10-8318-8dfc6288e1d8", "InterSystemsId":"3e90d4d1-956e-45ed-b56e-f57d5d763138", "IntraSystemId":"c4fae13d-1a46-42e8-b733-2f900d914102", "ModifiedProperties":"[]", "ObjectId":"4580fd1d-e5a3-4f56-9ad1-aab0e3bf8f76", "Operation":"UserLoggedIn", "OrganizationId":"8196ac17-fa44-43b4-b883-6ac627ac67ca", "RecordType":15, "ResultStatus":"Succeeded", "SupportTicketId":"", "Target":[]interface {}{common.MapStr{"ID":"1234cd1d-e5a3-4f56-9ad1-bba0e3bf8f76", "Type":0}}, "TargetContextId":"8196ac17-fa44-43b4-b883-6ac627ac67ca", "UserId":"[email protected]", "UserKey":"[email protected]", "UserType":0, "Version":1, "Workload":"AzureActiveDirectory", "agent":common.MapStr{"ephemeral_id":"12348684-45a5-486f-8e66-8330a6ef4567", "hostname":"testhost", "id":"03c971b4-cd4d-4d7e-a4b9-c24b5d6f56b5", "type":"o365beat", "version":"1.4.3"}, "client":common.MapStr{"ip":"10.10.10.10"}, "cloud":common.MapStr{"account":common.MapStr{"id":"8196ac17-fa44-43b4-b883-6ac627ac67ca"}}, "ecs":common.MapStr{"version":"1.1.0"}, "event":common.MapStr{"action":"UserLoggedIn", "category":"AzureActiveDirectory", "code":"15", "id":"099a789d-5426-4f10-8318-8dfc6277e1d8", "outcome":"Succeeded"}, "host":common.MapStr{"name":"testhost"}, "user":common.MapStr{"id":"[email protected]"}}, Private:interface {}(nil), TimeSeries:false}, Flags:0x0} (status=400): {"type":"mapper_parsing_exception","reason":"object mapping for [ExtendedProperties] tried to parse field [ExtendedProperties] as object, but found a concrete value"}

ModifiedProperties field is sent as object which causes problems for elasticsearch

ModifiedProperties shows up as a field in some workloads and causes the same parsing problems as Parameters and ExtendedProperties. Because it's not a true set of key-value pairs, it will require more involved parsing, which should only be done when needed.

The fix will be the same as for those other object fields: process them into a string, which can then be deserialized or parsed later if necessary, and which avoids the mapping error.
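A minimal sketch of that conversion, mirroring the convert entries already shown in the configs in these issues for Parameters and ExtendedProperties:

processors:
  - convert:
      fields:
        - {from: Parameters, type: string}         # no ecs mapping
        - {from: ExtendedProperties, type: string} # no ecs mapping
        - {from: ModifiedProperties, type: string} # no ecs mapping
      ignore_missing: true
      fail_on_error: false
      mode: copy # default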

WARN beater/o365beat.go:249 start XX must be <=YY hrs ago, resetting

Hi,
could anyone please explain to me what happens in the following case?
This is the scenario:
Assume I'm asking for all events from the prior YY hours, let's say 24h, and it is now 8:00:00.
o365beat sends a request with a start time of 8:00 yesterday.
But the response arrives at 8:05:00, five minutes after the request (my tenant is really busy).
Then I get this warning, I suppose because a window starting at 8:00:00 yesterday is older than 8:05:00 minus 24h.
Do I lose all the events, or only the events older than 24h?
I'm getting a lot of these warnings and a huge number of duplicate Ids, and I'm wondering whether that's related to this case.
Thank you for clarifying,
Roberto

Error Parsing @timestamp

Hiya,

I noticed that since I updated to 7.4.2 I'm now getting these messages:

Nov 28 08:49:39 elkserver o365beat[94399]: 2019-11-28T08:49:39.177Z ERROR instance/beat.go:877 Exiting: parsing time "" as "2006-01-02T15:04:05Z07:00": cannot parse "" as "2006"

I tried adjusting
# - {from: "CreationTime", to: "", type: ""} # @timestamp
to
- {from: "CreationTime", to: "", type: string} # @timestamp
in order to try and get something through, but it still fails with the same error.
Have you seen this before?

Client.Timeout for Exchange/General/Sharepoint

Hello, I'm attempting to use o365beat and I am able to get events from Audit.AzureActiveDirectory, but the other three content types all fail with a Client.Timeout. Am I wrong to assume that audit logging is turned on, given that I am able to retrieve AzureAD logs?

2020-02-25T15:14:18.368-0600	DEBUG	[api]	beater/o365beat.go:279	getting next page of results from NextPageUri: https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?contenttype=Audit.General&endtime=2020-02-20T10:46:42.0000000Z&starttime=2020-02-19T10:46:42.0000000Z&nextpage=20200220001806300068349$20200219104642000000000$na0022
2020-02-25T15:14:18.369-0600	DEBUG	[api]	beater/o365beat.go:112	issuing api request: https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?PublisherIdentifier={{tenantID}}&contenttype=Audit.General&endtime=2020-02-20T10%3A46%3A42.0000000Z&nextpage=20200220001806300068349%2420200219104642000000000%24na0022&starttime=2020-02-19T10%3A46%3A42.0000000Z
2020-02-25T15:14:19.954-0600	DEBUG	[api]	beater/o365beat.go:279	getting next page of results from NextPageUri: https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?contenttype=Audit.General&endtime=2020-02-20T10:46:42.0000000Z&starttime=2020-02-19T10:46:42.0000000Z&nextpage=20200220003621012087066$20200219104642000000000$na0022
2020-02-25T15:14:19.954-0600	DEBUG	[api]	beater/o365beat.go:112	issuing api request: https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?PublisherIdentifier={{tenantID}}&contenttype=Audit.General&endtime=2020-02-20T10%3A46%3A42.0000000Z&nextpage=20200220003621012087066%2420200219104642000000000%24na0022&starttime=2020-02-19T10%3A46%3A42.0000000Z
2020-02-25T15:14:21.769-0600	DEBUG	[api]	beater/o365beat.go:279	getting next page of results from NextPageUri: https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?contenttype=Audit.General&endtime=2020-02-20T10:46:42.0000000Z&starttime=2020-02-19T10:46:42.0000000Z&nextpage=20200220010537625168694$20200219104642000000000$na0022
2020-02-25T15:14:21.769-0600	DEBUG	[api]	beater/o365beat.go:112	issuing api request: https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?PublisherIdentifier={{tenantID}}&contenttype=Audit.General&endtime=2020-02-20T10%3A46%3A42.0000000Z&nextpage=20200220010537625168694%2420200219104642000000000%24na0022&starttime=2020-02-19T10%3A46%3A42.0000000Z
2020-02-25T15:14:22.763-0600	INFO	[monitoring]	log/log.go:145	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":135,"time":{"ms":13}},"total":{"ticks":453,"time":{"ms":43},"value":453},"user":{"ticks":318,"time":{"ms":30}}},"info":{"ephemeral_id":"8a86dd31-aa3c-40d8-aeab-d3f7fbfe87b2","uptime":{"ms":240065}},"memstats":{"gc_next":7404752,"memory_alloc":5387160,"memory_total":43785384,"rss":20480},"runtime":{"goroutines":13}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0}}},"system":{"load":{"1":3.6611,"15":3.6367,"5":3.6147,"norm":{"1":0.4576,"15":0.4546,"5":0.4518}}}}}}
2020-02-25T15:14:51.775-0600	INFO	[monitoring]	log/log.go:153	Total non-zero metrics	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":137,"time":{"ms":137}},"total":{"ticks":459,"time":{"ms":459},"value":459},"user":{"ticks":322,"time":{"ms":322}}},"info":{"ephemeral_id":"8a86dd31-aa3c-40d8-aeab-d3f7fbfe87b2","uptime":{"ms":269073}},"memstats":{"gc_next":7404752,"memory_alloc":6479216,"memory_total":44877440,"rss":32579584},"runtime":{"goroutines":10}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"elasticsearch"},"pipeline":{"clients":1,"events":{"active":0}}},"system":{"cpu":{"cores":8},"load":{"1":3.6987,"15":3.6426,"5":3.6328,"norm":{"1":0.4623,"15":0.4553,"5":0.4541}}}}}}
2020-02-25T15:14:51.775-0600	INFO	[monitoring]	log/log.go:154	Uptime: 4m29.077678404s
2020-02-25T15:14:51.776-0600	INFO	[monitoring]	log/log.go:131	Stopping metrics logging.
2020-02-25T15:14:51.776-0600	INFO	instance/beat.go:435	o365beat stopped.
2020-02-25T15:14:51.776-0600	ERROR	instance/beat.go:916	Exiting: error listing all available content between 2020-02-19 10:46:42 +0000 UTC and 2020-02-25 15:10:25.407664 -0600 CST m=+2.740111375: Get https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?PublisherIdentifier={{tenantID}}&contenttype=Audit.General&endtime=2020-02-20T10%3A46%3A42.0000000Z&nextpage=20200220010537625168694%2420200219104642000000000%24na0022&starttime=2020-02-19T10%3A46%3A42.0000000Z: net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Exiting: error listing all available content between 2020-02-19 10:46:42 +0000 UTC and 2020-02-25 15:10:25.407664 -0600 CST m=+2.740111375: Get https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?PublisherIdentifier={{tenantID}}&contenttype=Audit.General&endtime=2020-02-20T10%3A46%3A42.0000000Z&nextpage=20200220010537625168694%2420200219104642000000000%24na0022&starttime=2020-02-19T10%3A46%3A42.0000000Z: net/http: request canceled (Client.Timeout exceeded while awaiting headers)

2020-02-25T16:09:02.210-0600	DEBUG	[api]	beater/o365beat.go:279	getting next page of results from NextPageUri: https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?contenttype=Audit.Exchange&endtime=2020-02-20T10:46:42.0000000Z&starttime=2020-02-19T10:46:42.0000000Z&nextpage=20200220003934923146674$20200219104642000000000$na0022
2020-02-25T16:09:02.210-0600	DEBUG	[api]	beater/o365beat.go:112	issuing api request: https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?PublisherIdentifier={{tenantID}}&contenttype=Audit.Exchange&endtime=2020-02-20T10%3A46%3A42.0000000Z&nextpage=20200220003934923146674%2420200219104642000000000%24na0022&starttime=2020-02-19T10%3A46%3A42.0000000Z
2020-02-25T16:09:18.069-0600	INFO	[monitoring]	log/log.go:145	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":232,"time":{"ms":10}},"total":{"ticks":784,"time":{"ms":35},"value":784},"user":{"ticks":552,"time":{"ms":25}}},"info":{"ephemeral_id":"aa261288-f9d2-473e-a654-3afc0da61359","uptime":{"ms":420083}},"memstats":{"gc_next":8027264,"memory_alloc":5188720,"memory_total":77670584,"rss":540672},"runtime":{"goroutines":13}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0}}},"system":{"load":{"1":3.25,"15":3.2871,"5":3.2778,"norm":{"1":0.4063,"15":0.4109,"5":0.4097}}}}}}
2020-02-25T16:09:32.218-0600	INFO	[monitoring]	log/log.go:153	Total non-zero metrics	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":234,"time":{"ms":234}},"total":{"ticks":790,"time":{"ms":791},"value":790},"user":{"ticks":556,"time":{"ms":557}}},"info":{"ephemeral_id":"aa261288-f9d2-473e-a654-3afc0da61359","uptime":{"ms":434231}},"memstats":{"gc_next":8027264,"memory_alloc":4852192,"memory_total":78240840,"rss":33533952},"runtime":{"goroutines":10}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"elasticsearch"},"pipeline":{"clients":1,"events":{"active":0}}},"system":{"cpu":{"cores":8},"load":{"1":3.9014,"15":3.334,"5":3.4126,"norm":{"1":0.4877,"15":0.4167,"5":0.4266}}}}}}
2020-02-25T16:09:32.218-0600	INFO	[monitoring]	log/log.go:154	Uptime: 7m14.232693213s
2020-02-25T16:09:32.218-0600	INFO	[monitoring]	log/log.go:131	Stopping metrics logging.
2020-02-25T16:09:32.218-0600	INFO	instance/beat.go:435	o365beat stopped.
2020-02-25T16:09:32.219-0600	ERROR	instance/beat.go:916	Exiting: error listing all available content between 2020-02-19 10:46:42 +0000 UTC and 2020-02-25 16:02:18.876632 -0600 CST m=+0.940801257: Get https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?PublisherIdentifier={{tenantID}}&contenttype=Audit.Exchange&endtime=2020-02-20T10%3A46%3A42.0000000Z&nextpage=20200220003934923146674%2420200219104642000000000%24na0022&starttime=2020-02-19T10%3A46%3A42.0000000Z: net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Exiting: error listing all available content between 2020-02-19 10:46:42 +0000 UTC and 2020-02-25 16:02:18.876632 -0600 CST m=+0.940801257: Get https://manage.office.com/api/v1.0/{{tenantID}}/activity/feed/subscriptions/content?PublisherIdentifier={{tenantID}}&contenttype=Audit.Exchange&endtime=2020-02-20T10%3A46%3A42.0000000Z&nextpage=20200220003934923146674%2420200219104642000000000%24na0022&starttime=2020-02-19T10%3A46%3A42.0000000Z: net/http: request canceled (Client.Timeout exceeded while awaiting headers)

How to run with debugging output enabled on CentOS 7

Hello, I'm using this beat on a CentOS 7 machine. I'm trying to use the command ./o365beat --path.config . -c o365beat.yml -e -d "*" but it doesn't work. I tried it in the /etc/o365beat directory, outside of it, and even in the root directory; I installed the beat from the rpm package. How can I make it work?

GCC High - endpoint used returns no data

I don't see an option in the settings to change the API endpoint URLs. When attempting to use this for GCC High environments, it does not work and returns empty data because the wrong endpoint is being targeted for the API.

Endpoint Mapping information can be found here:
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/active-directory/fundamentals/whats-new-microsoft-365-government.md

https://docs.microsoft.com/en-us/office365/enterprise/office-365-u-s-government-gcc-high-endpoints

Beat keystore possibility

Hello,

First let me say great work putting this beat together; I'm very excited to use it. I am new to the world of beat development, so I ask a humble question: other Elastic beats have keystores where secrets can be kept. Would it be possible to either use an existing keystore from another beat, or perhaps could a keystore be added to o365beat at a later date?

ClientIP not parsed to ECS client.ip when it has port but no brackets

Microsoft's API creates events with the ClientIP field in one of (at least) three different formats:

  1. 10.10.10.10
  2. [10.10.10.10]:10100
  3. 10.10.10.10:10100

We handled the first two, but this third format (with a port but no brackets) is not handled by the current processors in o365beat.yml. The fix should be something like adding the following to the current o365beat.yml, but it still needs testing:

processors:
  - dissect:
      field: ClientIP
      tokenizer: '%{clientip}:%{clientport}'
      when:
        contains:
          ClientIP: ':'
        not:
          contains:
            ClientIP: '['

Config template "processors" section shadows custom ECS processors

As noted by @SecBear, the default config file contains a processors section that gets merged into the o365beat.yml and shadows the custom ECS processors. To fix that, you have to manually remove the template section, or merge the two. From my reply to @SecBear's PR:

This is definitely true: the second processor section "shadows" the first, and has to be removed or merged before use. The problem is, when building the beat the build tools actually create o365beat.yml dynamically by merging _meta/beat.yml with a config file template within the libbeat framework (libbeat/_meta/config.yml.tmpl). It's that .tmpl file that has the extra processor section, and I haven't had time to figure out how to suppress the inclusion of that section. Instead I do what you did in the PR, which is combine them or remove the second processors section altogether.

Unfortunately, if I merge this, it'll be clobbered by the build for the next release and I'll have to re-insert it by hand, which may be the best bet until we sort out a durable fix. Any thoughts on how to suppress the inclusion of the processors section from the libbeat template would be much appreciated! Or, if there's a smarter way to think about the issue, I'm happy to hear that too.

Thanks again for the contribution, I really appreciate the engagement, I'm sure we can sort out a long-term fix for what is definitely a real issue and inconvenience.

Originally posted by @chris-counteractive in #7 (comment)

Dashboard and visualizations not working - error with fields.keyword

We've been using o365beat v1.4.0 for some time and have had no issues with the API connection or pulling data from the Azure tenant. Recently we wanted to try the dashboards and visualizations available with the latest version, so we upgraded to v1.5.1.

I'm still able to see the content blobs being pulled and the data written to the Elasticsearch nodes, but the dashboards and visualizations show errors at every step. First, I got this error:

*** Could not locate that index-pattern (id: ceb2f990-f469-11e9-9f4d-5dd1f9c9e483), click here to re-create it

I realized the index-pattern ID might not have been set properly, so I edited the saved searches (o365 alerts | o365 unique users (logins) | o365 client ips | o365 user actions) to reference the index-pattern ID, index refname, and name as o365beat-* (this seems to be the custom ID for the o365beat-* index pattern that was automatically created when o365beat was run for the first time). I'm still not sure why it was not able to automatically reference the right index, but at least after this step that error was gone.

Now I'm getting these errors on each of the visualizations:

For visualization o365 unique users (logins):
Could not locate that index-pattern-field (id: user.id.keyword)

For visualization o365 user actions:
Could not locate that index-pattern-field (id: user.id.keyword)

For visualization o365 client ips:
Could not locate that index-pattern-field (id: client.ip.keyword)

Question about possibility of monitoring multiple tenant domains

For my current project I am required to pull logs from multiple tenant domains and output each to a separate index. My current solution is running one instance of o365beat per domain that I am pulling logs from. I'm just curious whether I can somehow configure the beat to pull from each domain and use conditionals to send the output to different indices (see the sketch below for the output side). In my experience with beats in the past this was done with multiple prospectors, but I'm not sure that is possible with o365beat. If it is not, I will just continue running multiple instances.
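On the output side, libbeat's Elasticsearch output supports conditional indices; a minimal sketch, assuming events from different tenants can be told apart by a field such as OrganizationId (the GUIDs and index names below are placeholders), and noting that this does not by itself let a single o365beat instance poll more than one tenant:

output.elasticsearch:
  index: "o365beat-default-%{+yyyy.MM.dd}"
  indices:
    - index: "o365beat-tenant-a-%{+yyyy.MM.dd}"
      when.equals:
        OrganizationId: "00000000-0000-0000-0000-0000000000aa"
    - index: "o365beat-tenant-b-%{+yyyy.MM.dd}"
      when.equals:
        OrganizationId: "00000000-0000-0000-0000-0000000000bb"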
