cb-dragonfly's People

Contributors

dependabot[bot], dev-secloudit, devpjh121, inno-cloudbarista, jihoon-seo, jin-whee-park, jmleefree, jongwooo, movey1, pjhmong, pjini, seokho-son, sunyeongchoi

cb-dragonfly's Issues

Dragonfly consumes a lot of disk space when left running for a long time

When Dragonfly has been left running for a long time, it ends up consuming a large amount of disk space.

docker system df -v

Containers space usage:

CONTAINER ID        IMAGE                                        COMMAND                  LOCAL VOLUMES       SIZE                CREATED             STATUS              NAMES
95647e702259        cloudbaristaorg/cb-webtool:v0.1-20200326     "/bin/sh -c 'reflex …"   0                   6.31MB              3 days ago          Up 20 hours         cb-webtool
89b039b1471e        cloudbaristaorg/cb-tumblebug:v0.1.3          "/app/src/cb-tumbleb…"   0                   0B                  3 days ago          Up 20 hours         cb-tumblebug
d4b6a9446610        cloudbaristaorg/cb-spider:v0.1.3             "/root/go/src/github…"   0                   1.48kB              3 days ago          Up 20 hours         cb-spider
0dbfad1f54b9        cloudbaristaorg/cb-restapigw:v0.1-20200408   "/app/cb-restapigw -…"   0                   0B                  3 days ago          Up 20 hours         cb-restapigw
40ee7674a43b        grafana/grafana                              "/run.sh"                0                   0B                  3 days ago          Up 20 hours         cb-restapigw-grafana
25502302b5cf        cloudbaristaorg/cb-dragonfly:v0.1-20200422   "./wait-for-it-wrapp…"   0                   54.2GB              3 days ago          Up 20 hours         cb-dragonfly
eabc58e922ad        bitnami/etcd:latest                          "/entrypoint.sh etcd"    0                   128MB               3 days ago          Up 20 hours         cb-dragonfly-etcd
3ea1e1a445d0        influxdb:latest                              "/entrypoint.sh infl…"   1                   0B                  3 days ago          Up 20 hours         cb-dragonfly-influxdb
f04f7b19b579        influxdb:latest                              "/entrypoint.sh infl…"   0                   0B                  3 days ago          Up 20 hours         cb-restapigw-influxdb
b8251e0a6454        jaegertracing/all-in-one:latest              "/go/bin/all-in-one-…"   1                   0B                  3 days ago          Up 20 hours         cb-restapigw-jaeger

TODO: Open port for CB-Dragonfly gRPC server

K8s namespace "dragonfly" is hardcoded in DF

What happened
:
cloud-barista/cb-operator#172 (comment)

I ran the Cloud-Barista Helm chart that lives in cb-operator,
and the DF pod keeps going into the CrashLoopBackOff state.

NAME                                                     READY   STATUS             RESTARTS   AGE
cb-dragonfly-5c6cfdb965-d4hl9                            0/1     CrashLoopBackOff   6          13m
cb-dragonfly-influxdb-0                                  1/1     Running            0          13m
cb-dragonfly-kafka-0                                     1/1     Running            1          13m
cb-dragonfly-zookeeper-0                                 1/1     Running            0          13m
cb-mcks-64fbc88ff9-498zv                                 1/1     Running            0          13m
cb-restapigw-786c5789fb-6wcht                            1/1     Running            0          13m
cb-restapigw-influxdb-8494b7858b-x9f5n                   1/1     Running            0          13m
cb-restapigw-jaeger-agent-fw9kz                          1/1     Running            0          13m
cb-restapigw-jaeger-cassandra-schema-zhxfz               0/1     Completed          0          13m
cb-restapigw-jaeger-collector-5955dd767f-mt5d7           1/1     Running            2          13m
cb-restapigw-jaeger-query-547fd7844-5mxpc                2/2     Running            5          13m
cb-restapigw-readonly-85f699b77b-2dnq7                   1/1     Running            0          13m
cb-spider-748cfff964-bklx2                               1/1     Running            0          13m
cb-tumblebug-f9d7dfc64-rv7m5                             1/1     Running            0          13m
cb-tumblebug-mapui-854b567558-wrrt6                      1/1     Running            0          13m
cb-webtool-994b7bc66-qslhh                               1/1     Running            0          13m
cloud-barista-cassandra-0                                1/1     Running            0          13m
cloud-barista-cassandra-1                                1/1     Running            0          10m
cloud-barista-cassandra-2                                1/1     Running            0          8m52s
cloud-barista-dragonfly-kapacitor-5cb786f778-wctj2       1/1     Running            1          13m
cloud-barista-etcd-0                                     1/1     Running            0          13m
cloud-barista-grafana-5fd6c7d4b-tkzdr                    2/2     Running            0          13m
cloud-barista-kube-state-metrics-599fc9945-8nkmm         1/1     Running            0          13m
cloud-barista-prometheus-alertmanager-5f66fd95f5-ch5tx   2/2     Running            0          13m
cloud-barista-prometheus-node-exporter-tqmbw             1/1     Running            0          13m
cloud-barista-prometheus-pushgateway-764b656db6-7r5qb    1/1     Running            0          13m
cloud-barista-prometheus-server-656458bcb9-6tgjv         2/2     Running            0          13m
docker-registry-6f96999dd6-n6b82                         1/1     Running            0          13m

The DF pod's log is as follows.

❯ kubectl logs cb-dragonfly-5c6cfdb965-jdzv4 -n cloud-barista
time="2021-11-25T11:43:30+09:00" level=info msg="create tick file with name default.tick"
[CLOUD-BARISTA].[INFO]: 2021-11-25 11:43:30 etcd-driver.go:102, github.com/cloud-barista/cb-store/store-drivers/etcd-driver.(*ETCDDriver).Put() - Key:/monitoring/configs/agent_interval, value:2 
[CLOUD-BARISTA].[INFO]: 2021-11-25 11:43:30 etcd-driver.go:102, github.com/cloud-barista/cb-store/store-drivers/etcd-driver.(*ETCDDriver).Put() - Key:/monitoring/configs/collector_interval, value:10 
[CLOUD-BARISTA].[INFO]: 2021-11-25 11:43:30 etcd-driver.go:102, github.com/cloud-barista/cb-store/store-drivers/etcd-driver.(*ETCDDriver).Put() - Key:/monitoring/configs/max_host_count, value:5 
[CLOUD-BARISTA].[INFO]: 2021-11-25 11:43:30 etcd-driver.go:102, github.com/cloud-barista/cb-store/store-drivers/etcd-driver.(*ETCDDriver).Put() - Key:/monitoring/configs/puller_interval, value:10 
[CLOUD-BARISTA].[INFO]: 2021-11-25 11:43:30 etcd-driver.go:102, github.com/cloud-barista/cb-store/store-drivers/etcd-driver.(*ETCDDriver).Put() - Key:/monitoring/configs/puller_aggregate_interval, value:30 
[CLOUD-BARISTA].[INFO]: 2021-11-25 11:43:30 etcd-driver.go:102, github.com/cloud-barista/cb-store/store-drivers/etcd-driver.(*ETCDDriver).Put() - Key:/monitoring/configs/aggregate_type, value:avg 
[CLOUD-BARISTA].[INFO]: 2021-11-25 11:43:30 etcd-driver.go:102, github.com/cloud-barista/cb-store/store-drivers/etcd-driver.(*ETCDDriver).Put() - Key:/monitoring/configs/deploy_type, value:helm 
[CLOUD-BARISTA].[INFO]: 2021-11-25 11:43:30 etcd-driver.go:102, github.com/cloud-barista/cb-store/store-drivers/etcd-driver.(*ETCDDriver).Put() - Key:/monitoring/configs/monitoring_policy, value:agentCount 
[CLOUD-BARISTA].[INFO]: 2021-11-25 11:43:30 etcd-driver.go:102, github.com/cloud-barista/cb-store/store-drivers/etcd-driver.(*ETCDDriver).Put() - Key:/monitoring/configs/default_policy, value:push 
[CLOUD-BARISTA].[ERROR]: 2021-11-25 11:43:30 mechanism.go:45, github.com/cloud-barista/cb-dragonfly/pkg/modules/procedure.startPushModule() - failed to initialize collector manager 
panic: configmaps is forbidden: User "system:serviceaccount:cloud-barista:cb-dragonfly" cannot create resource "configmaps" in API group "" in the namespace "dragonfly"

goroutine 1 [running]:
main.main()
        /go/src/github.com/cloud-barista/cb-dragonfly/pkg/main.go:54 +0x2e6

Looking at the DF code, I found that PR #124 added the following constant:

const (
	Namespace = "dragonfly"


The CB-Dragonfly Helm chart maintained inside the CB-Dragonfly repo
appears to use a K8s namespace named dragonfly,

while the CB-Dragonfly Helm chart maintained inside the cb-operator repo
is configured so that most Cloud-Barista-related pods run in the cloud-barista K8s namespace.

Therefore, rather than hardcoding Namespace = "dragonfly" in the DF source code,
a different approach would be preferable.
(As things stand, the CB-Dragonfly Helm chart maintained in the cb-operator repo does not work.)

@hyokyungk @inno-cloudbarista @devpjh121 Could you please review this?
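One possible direction, sketched below purely as an illustration: resolve the namespace at runtime from an env-var override or the in-cluster service-account file, falling back to the current default. The function and variable names (GetNamespace, DRAGONFLY_NAMESPACE) are assumptions, not existing DF code.

package config

import (
	"os"
	"strings"
)

const defaultNamespace = "dragonfly"

// GetNamespace resolves the K8s namespace at runtime instead of relying on a
// hardcoded constant.
func GetNamespace() string {
	// 1. Explicit override, e.g. set by the Helm chart.
	if ns := os.Getenv("DRAGONFLY_NAMESPACE"); ns != "" {
		return ns
	}
	// 2. The namespace the pod is actually running in (standard in-cluster path).
	if b, err := os.ReadFile("/var/run/secrets/kubernetes.io/serviceaccount/namespace"); err == nil {
		if ns := strings.TrimSpace(string(b)); ns != "" {
			return ns
		}
	}
	// 3. Fall back to the current default.
	return defaultNamespace
}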

What you expected to happen
:

How to reproduce it (as minimally and precisely as possible)
:

Anything else we need to know?
:

Environment

  • Source version or branch:
  • OS:
  • Others:

Proposed solution
:

Any other context
:

Add TCP/IP connection close logic

Currently, CB-Dragonfly requires its connections to stay open for the entire lifetime of the process.
The TCP/IP connection close logic needs to be reinforced for abnormal and forced termination.
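A minimal sketch of one possible shape for this, assuming a signal-driven shutdown path is acceptable; the helper name and closer list are illustrative, not existing DF code.

package main

import (
	"context"
	"log"
	"os/signal"
	"syscall"
)

// runUntilShutdown blocks until SIGINT/SIGTERM and then closes the given
// long-lived connections (TCP sockets, Kafka/InfluxDB clients, ...), so that
// forced termination still releases them cleanly.
func runUntilShutdown(closers ...func() error) {
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGINT, syscall.SIGTERM)
	defer stop()

	<-ctx.Done() // wait for a termination signal
	for _, c := range closers {
		if err := c(); err != nil {
			log.Printf("failed to close connection: %v", err)
		}
	}
}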

Remove cb-dragonfly binary file from repo

There is currently a cb-dragonfly binary file in cb-dragonfly/bin/.

If it was committed by mistake, or is not strictly necessary, how about removing it from the repo and adding it to .gitignore? ^^

Question: how to check whether a VM's agent is working properly

API version
CB-SPIDER : 0.7
CB-TUMBLEBUG : 0.7
CB-DRAGONFLY : 0.7

Spider, Tumblebug, and Dragonfly are all running.
I created an MCIS and a VM (AWS) through Tumblebug.

When I connect as the ubuntu user and run
sudo service telegraf status, the output includes:
Apr 11 10:51:42 ip-10-0-0-194 telegraf[7256]: {"time":"2023-04-11T10:51:42.07618761Z","id":"","remote_ip":"xxx.xxx.xxx.xxx","host":"yyy.yyy.yyy.yyy:8888","m
Apr 11 10:51:44 ip-10-0-0-194 telegraf[7256]: {"time":"2023-04-11T10:51:44.051470471Z","id":"","remote_ip":"xxx.xxx.xxx.xxx","host":"yyy.yyy.yyy.yyy:8888","

What are the ways to check the configuration on Dragonfly and on the VM?
Please also let me know how to verify that data is accumulating properly.


Additionally, when I run cat /etc/hosts on the MCIS VM,
cb-dragonfly-kafka and cb-dragonfly point to the Dragonfly VM's IP,
and cb-agent points to the VM's own public IP.

Dragonfly is running in Docker on the Dragonfly VM,
and data seems to have stopped accumulating in InfluxDB.

I thought the API returned no results because the duration was 5m, so I made it longer, but still got nothing back.

The API called was Get vm monitoring info:
{{ip}}:{{port}}/dragonfly/ns/:ns_id/mcis/:mcis_id/vm/:vm_id/metric/:metric_name/info?periodType=m&statisticsCriteria=last&duration=5m
-> periodType=d&statisticsCriteria=last&duration=365d

Result:
{
"message": "not found metric data, metric=cpu"
}

With namespace = buybay, mcisId = goods, and vmId = electronics-1,

I ran the following in InfluxDB with database = cbmon:
select count(*) from cpu where nsId='buybay' and mcisId='goods' and vmId='electronics-1';
The count is only 530 for each field.

Is there a query or log I can check?
The Dragonfly VM has been up the whole time,
and the MCIS VM was created on March 29.
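One way to check ingestion directly is to ask InfluxDB for the newest cpu point for the VM; a sketch with the InfluxDB 1.x Go client follows. The address, database name (cbmon) and tag names (nsId/mcisId/vmId) mirror this issue, but whether they match your deployment is an assumption; docker logs cb-dragonfly and the telegraf journal on the VM are the other obvious places to look.

package main

import (
	"fmt"
	"log"

	client "github.com/influxdata/influxdb1-client/v2"
)

func main() {
	// Assumed address: the host-mapped InfluxDB port on the Dragonfly VM.
	c, err := client.NewHTTPClient(client.HTTPConfig{Addr: "http://localhost:28086"})
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	// Fetch the newest cpu point for this VM; an old or missing timestamp
	// means ingestion has stopped.
	q := client.NewQuery(
		`SELECT * FROM "cpu" WHERE "nsId"='buybay' AND "mcisId"='goods' AND "vmId"='electronics-1' ORDER BY time DESC LIMIT 1`,
		"cbmon", "s")
	resp, err := c.Query(q)
	if err != nil {
		log.Fatalf("query failed: %v", err)
	}
	if resp.Error() != nil {
		log.Fatalf("query error: %v", resp.Error())
	}
	fmt.Printf("%+v\n", resp.Results)
}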

[bug] Error in the latest-data query API

  • Querying the disk metric from CB-Dragonfly raises an "assignment to entry in nil map" error
cb-dragonfly             | echo: http: panic serving 129.254.175.187:53280: assignment to entry in nil map
cb-dragonfly             | goroutine 2295484 [running]:
cb-dragonfly             | net/http.(*conn).serve.func1(0xc000136320)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:1767 +0x139
cb-dragonfly             | panic(0x98f4a0, 0xaee780)
cb-dragonfly             |      /usr/local/go/src/runtime/panic.go:679 +0x1b2
cb-dragonfly             | github.com/cloud-barista/cb-dragonfly/pkg/manager.(*APIServer).GetVMRealtimeMonInfo(0xc000108700, 0xb151a0, 0xc0006186c0, 0xc2d51cb5, 0x58b4daf8b8e663f9)
cb-dragonfly             |      /go/src/github.com/cloud-barista/cb-dragonfly/pkg/manager/apiserver.go:401 +0x90a
cb-dragonfly             | github.com/labstack/echo/v4.(*Echo).add.func1(0xb151a0, 0xc0006186c0, 0xa3f69b, 0x1b)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/echo.go:505 +0x8a
cb-dragonfly             | github.com/labstack/echo/v4/middleware.CORSWithConfig.func1.1(0xb151a0, 0xc0006186c0, 0x1, 0x1)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/middleware/cors.go:121 +0x477
cb-dragonfly             | github.com/labstack/echo/v4.(*Echo).ServeHTTP(0xc0001f8000, 0xb036c0, 0xc0002920e0, 0xc000106100)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/echo.go:616 +0x22a
cb-dragonfly             | net/http.serverHandler.ServeHTTP(0xc000108540, 0xb036c0, 0xc0002920e0, 0xc000106100)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:2802 +0xa4
cb-dragonfly             | net/http.(*conn).serve(0xc000136320, 0xb04e40, 0xc00041a380)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:1890 +0x875
cb-dragonfly             | created by net/http.(*Server).Serve
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:2927 +0x38e
cb-dragonfly             | echo: http: panic serving 129.254.175.187:53290: assignment to entry in nil map
cb-dragonfly             | goroutine 2297387 [running]:
cb-dragonfly             | net/http.(*conn).serve.func1(0xc0027cf680)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:1767 +0x139
cb-dragonfly             | panic(0x98f4a0, 0xaee780)
cb-dragonfly             |      /usr/local/go/src/runtime/panic.go:679 +0x1b2
cb-dragonfly             | github.com/cloud-barista/cb-dragonfly/pkg/manager.(*APIServer).GetVMRealtimeMonInfo(0xc000108700, 0xb151a0, 0xc000618870, 0xd8a111e, 0x6d79c3302fa41783)
cb-dragonfly             |      /go/src/github.com/cloud-barista/cb-dragonfly/pkg/manager/apiserver.go:401 +0x90a
cb-dragonfly             | github.com/labstack/echo/v4.(*Echo).add.func1(0xb151a0, 0xc000618870, 0xa3f69b, 0x1b)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/echo.go:505 +0x8a
cb-dragonfly             | github.com/labstack/echo/v4/middleware.CORSWithConfig.func1.1(0xb151a0, 0xc000618870, 0x1, 0x1)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/middleware/cors.go:121 +0x477
cb-dragonfly             | github.com/labstack/echo/v4.(*Echo).ServeHTTP(0xc0001f8000, 0xb036c0, 0xc0001087e0, 0xc002749800)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/echo.go:616 +0x22a
cb-dragonfly             | net/http.serverHandler.ServeHTTP(0xc000108540, 0xb036c0, 0xc0001087e0, 0xc002749800)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:2802 +0xa4
cb-dragonfly             | net/http.(*conn).serve(0xc0027cf680, 0xb04e40, 0xc002739e80)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:1890 +0x875
cb-dragonfly             | created by net/http.(*Server).Serve
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:2927 +0x38e
cb-dragonfly             | echo: http: panic serving 129.254.175.187:53275: assignment to entry in nil map
cb-dragonfly             | goroutine 2294723 [running]:
cb-dragonfly             | net/http.(*conn).serve.func1(0xc000593a40)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:1767 +0x139
cb-dragonfly             | panic(0x98f4a0, 0xaee780)
cb-dragonfly             |      /usr/local/go/src/runtime/panic.go:679 +0x1b2
cb-dragonfly             | github.com/cloud-barista/cb-dragonfly/pkg/manager.(*APIServer).GetVMRealtimeMonInfo(0xc000108700, 0xb151a0, 0xc000618750, 0xc9ec363, 0x293a46a11aabfae7)
cb-dragonfly             |      /go/src/github.com/cloud-barista/cb-dragonfly/pkg/manager/apiserver.go:401 +0x90a
cb-dragonfly             | github.com/labstack/echo/v4.(*Echo).add.func1(0xb151a0, 0xc000618750, 0xa3f69b, 0x1b)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/echo.go:505 +0x8a
cb-dragonfly             | github.com/labstack/echo/v4/middleware.CORSWithConfig.func1.1(0xb151a0, 0xc000618750, 0x1, 0x1)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/middleware/cors.go:121 +0x477
cb-dragonfly             | github.com/labstack/echo/v4.(*Echo).ServeHTTP(0xc0001f8000, 0xb036c0, 0xc000108380, 0xc000294600)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/echo.go:616 +0x22a
cb-dragonfly             | net/http.serverHandler.ServeHTTP(0xc000108540, 0xb036c0, 0xc000108380, 0xc000294600)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:2802 +0xa4
cb-dragonfly             | net/http.(*conn).serve(0xc000593a40, 0xb04e40, 0xc0005cecc0)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:1890 +0x875
cb-dragonfly             | created by net/http.(*Server).Serve
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:2927 +0x38e
cb-dragonfly             | echo: http: panic serving 129.254.175.187:53291: assignment to entry in nil map
cb-dragonfly             | goroutine 2297578 [running]:
cb-dragonfly             | net/http.(*conn).serve.func1(0xc000592140)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:1767 +0x139
cb-dragonfly             | panic(0x98f4a0, 0xaee780)
cb-dragonfly             |      /usr/local/go/src/runtime/panic.go:679 +0x1b2
cb-dragonfly             | github.com/cloud-barista/cb-dragonfly/pkg/manager.(*APIServer).GetVMRealtimeMonInfo(0xc000108700, 0xb151a0, 0xc0006181b0, 0x6cef9418, 0x7ec36b3af7042e29)
cb-dragonfly             |      /go/src/github.com/cloud-barista/cb-dragonfly/pkg/manager/apiserver.go:401 +0x90a
cb-dragonfly             | github.com/labstack/echo/v4.(*Echo).add.func1(0xb151a0, 0xc0006181b0, 0xa3f69b, 0x1b)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/echo.go:505 +0x8a
cb-dragonfly             | github.com/labstack/echo/v4/middleware.CORSWithConfig.func1.1(0xb151a0, 0xc0006181b0, 0x1, 0x1)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/middleware/cors.go:121 +0x477
cb-dragonfly             | github.com/labstack/echo/v4.(*Echo).ServeHTTP(0xc0001f8000, 0xb036c0, 0xc0001080e0, 0xc002748d00)
cb-dragonfly             |      /go/src/github.com/cloud-barista/pkg/mod/github.com/labstack/echo/[email protected]/echo.go:616 +0x22a
cb-dragonfly             | net/http.serverHandler.ServeHTTP(0xc000108540, 0xb036c0, 0xc0001080e0, 0xc002748d00)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:2802 +0xa4
cb-dragonfly             | net/http.(*conn).serve(0xc000592140, 0xb04e40, 0xc00041a200)
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:1890 +0x875
cb-dragonfly             | created by net/http.(*Server).Serve
cb-dragonfly             |      /usr/local/go/src/net/http/server.go:2927 +0x38e
cb-dragonfly-influxdb    | [httpd] 172.18.0.4 - cbmon [13/Jul/2020:12:59:18 +0000] "POST /write?consistency=&db=cbmon&precision=ns&rp= HTTP/1.1" 204 0 "-" "InfluxDBClient" a9b8cbb2-c508-11ea-8784-0242ac120003 2810
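For context, the panic occurs when code assigns into a map that was never allocated. A minimal, hypothetical illustration of the failure mode and the usual fix (this is not the actual apiserver.go code):

package main

import "fmt"

type realtimeInfo struct {
	Values map[string]interface{}
}

func main() {
	var info realtimeInfo
	// info.Values is nil here, so the next line would panic with
	// "assignment to entry in nil map", exactly as in the stack trace above:
	//   info.Values["disk"] = 42

	// Fix: allocate the map before the first assignment.
	info.Values = make(map[string]interface{})
	info.Values["disk"] = 42
	fmt.Println(info.Values)
}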

Hard to find API document

What would you like to be enhanced
: It is not easy to find the API documentation in the CB-DF repository.

Why is this needed
: Developer and user convenience ++

Proposed solution
: Add a link to the README

Check appropriate 3rd party pkg

  • Related issue: cloud-barista/cb-tumblebug#368
  • CB-Dragonfly currently makes heavy use of the shaodan/kapacitor-client package.
  • However, that package is a forked repo rather than the official one, so it is hard to expect it to keep receiving updates.
  • Would it be possible for CB-Dragonfly to replace it with the official repo?
    • For example:
      replace import "github.com/shaodan/kapacitor-client" with import "github.com/influxdata/kapacitor/client/v1",
      update go.mod if needed,
      and test that CB-Dragonfly still builds & runs (a hedged call-site sketch follows below).

It would be great if you could look into this. ^^
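If the swap is attempted, here is a hedged sketch of what a call site might look like with the official client, assuming it keeps the same New/Config/Ping surface as the fork (to be verified against the actual package):

package main

import (
	"log"
	"time"

	kapacitor "github.com/influxdata/kapacitor/client/v1"
)

func main() {
	// Connect with the official client instead of shaodan/kapacitor-client.
	cli, err := kapacitor.New(kapacitor.Config{URL: "http://localhost:29092"})
	if err != nil {
		log.Fatal(err)
	}
	// Smoke test: ping the Kapacitor server to confirm the client still works.
	if _, _, err := cli.Ping(5 * time.Second); err != nil {
		log.Fatalf("kapacitor ping failed: %v", err)
	}
	log.Println("kapacitor client OK")
}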

influxdb: port 8083 is no longer used

  • Reportedly, as of InfluxDB 1.7, the web service that used to be offered on port 8083 is no longer provided.
  • Current status
    • cb-dragonfly/docker-compose-dev-df.yaml: 28083:8083
    • cb-dragonfly/docker-compose.yaml: 28083:8083
  • Current status 2
    • When Cloud-Barista is run with the Helm chart, influxdb listens on ports 8086 and 8088.
NAMESPACE       NAME                                     TYPE        CLUSTER-IP       EXTERNAL-IP   PORT(S)
cloud-barista   cb-dragonfly-influxdb                    ClusterIP   10.96.188.11     <none>        8086/TCP,8088/TCP
cloud-barista   cb-restapigw-influxdb                    ClusterIP   10.106.10.169    <none>        8086/TCP,8088/TCP
  • Proposal: delete the following lines
    • cb-dragonfly/docker-compose-dev-df.yaml: 28083:8083
    • cb-dragonfly/docker-compose.yaml: 28083:8083

"failed to change telegraf permission" error occurs for an AWS Ubuntu 18.04 VM

What happened
:

[AWS VM info]

  • Region: Ohio (us-east-2)
  • Image: ami-01e7ca2ef94a0ae86 (amazon/ubuntu/images/hvm-ssd/ubuntu-bionic-18.04-amd64-server-20210224)
  • Spec: t2.micro

[Agent install request body]
POST http://localhost:9090/dragonfly/agent

{
  "ns_id": "ns01",
  "mcis_id": "jhseo",
  "vm_id": "aws-us-east-2-1",
  "public_ip": "18.219.61.250",
  "port": "22",
  "user_name": "cb-user",
  "ssh_key": "-----BEGIN RSA PRIVATE KEY-----\n...\n-----END RSA PRIVATE KEY-----",
  "cspType": "aws",
  "service_type": "vm"
}

[Agent install response body]

{
  "message": "failed to change telegraf permission, err=Process exited with status 1"
}

What you expected to happen
:

How to reproduce it (as minimally and precisely as possible)
:

Anything else we need to know?
:

Environment

  • Source version or branch:
    • CB-Spider latest version (f09c6c3, Fri Oct 21 20:11:55 2022 +0900)
    • CB-Tumblebug latest version (5b53b8c, Mon Oct 24 17:43:32 2022 +0900,
      after PR cloud-barista/cb-tumblebug#1239 was merged)
    • CB-Dragonfly latest version
  • OS:
  • Others:

Proposed solution
:

Any other context
:
(CC @powerkimhub: this issue occurred during the "GuestOS extended support verification test".)

Config: YAML vs. env var

Hello,
I am opening an issue about the configuration file.

This is the same issue I raised in CB-Store
( cloud-barista/cb-store#14 )

Currently, Dragonfly is configured through the conf/config.yaml file.

I built an image with the Dockerfile you provided
(docker build --tag cloudbaristahub/cb-dragonfly:v0.1-20200327 .),

but since I built the image without replacing {{influxdb_ip}}, {{password}}, etc. in the config file,

running the container (docker run -p 8094:8094 --name cb-dragonfly cloudbaristahub/cb-dragonfly:v0.1-20200327) fails with the following error:

panic: yaml: unmarshal errors:
  line 8: cannot unmarshal !!map into string
  line 17: cannot unmarshal !!map into string

goroutine 1 [running]:
main.main()
        /go/src/github.com/cloud-barista/cb-dragonfly/pkg/manager/main/main.go:35 +0x54e

One workaround is to keep a correct config.yaml on the host machine
and mount it with the -v option when starting the container,

but considering future deployment via docker-compose, Helm chart, and so on,
that does not seem like an appropriate approach.

I would like to ask what you think about accepting the configuration items as environment variables instead.

Ref: @powerkimhub @seokho-son
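One hedged option, assuming Viper (already common in Cloud-Barista components) is acceptable: keep config.yaml as the base and let environment variables override individual keys, so the image no longer needs the values baked in. The env prefix and key names below are illustrative.

package config

import (
	"strings"

	"github.com/spf13/viper"
)

// Load reads conf/config.yaml and lets environment variables such as
// CBMON_INFLUXDB_ENDPOINT_URL override individual keys.
func Load() (*viper.Viper, error) {
	v := viper.New()
	v.SetConfigFile("conf/config.yaml")

	v.SetEnvPrefix("CBMON")
	v.SetEnvKeyReplacer(strings.NewReplacer(".", "_"))
	v.AutomaticEnv()

	if err := v.ReadInConfig(); err != nil {
		return nil, err
	}
	return v, nil
}

With something like this, a plain docker run with -e CBMON_INFLUXDB_ENDPOINT_URL=http://cb-dragonfly-influxdb could supply deployment-specific values without rebuilding the image or mounting a file.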

Need to fix store_conf.yaml

What happened
:
DF metadata is being stored in an unknown location

What you expected to happen
:
DF metadata is stored in dbpath: "$CBSTORE_ROOT/meta_db/dat"

How to reproduce it (as minimally and precisely as possible)
:

Anything else we need to know?
:

Environment

  • Source version or branch:
  • OS:
  • Others:

Proposed solution
:

Modify conf/store_conf.yaml as follows:

-   dbpath: "meta_db/dat"
+   dbpath: "$CBSTORE_ROOT/meta_db/dat"

TODO to @jihoon-seo
:

  • The corresponding files (2 places) in the cb-operator repo also need to be fixed
  • Add a volume mount to docker-compose.yaml in both the DF repo and the cb-operator repo
  • Add a PV/PVC to the Helm chart in the cb-operator repo

`Store.Put()` vs. `StorePut()`

In the DF source,

most call sites use the cbstore.GetInstance().StorePut() function (18 references),

but there is one reference that uses the cbstore.GetInstance().Store.Put() function:

cb-dragonfly/pkg/core/agent/metadata.go

func (a AgentListManager) putAgentListToStore(agentList map[string]AgentInfo) error {
	agentListBytes, err := json.Marshal(agentList)
	if err != nil {
		return errors.New(fmt.Sprintf("failed to convert agentList format to json, error=%s", err))
	}
	err = cbstore.GetInstance().Store.Put(AgentListKey, string(agentListBytes))
	if err != nil {
		return errors.New(fmt.Sprintf("failed to put agentList, error=%s", err))
	}
	return nil
}

I would like to ask whether this is intentional or whether it should be made consistent. 😊

Unintended file `tmp` exists in CB-DF repo

What happened
:
PR #124 added a file named tmp to the CB-DF repo root.

If it is not strictly necessary,
it would be good to delete it. 😊

What you expected to happen
:

How to reproduce it (as minimally and precisely as possible)
:

Anything else we need to know?
:

Environment

  • Source version or branch:
  • OS:
  • Others:

Proposed solution
:

Any other context
:

Should NS be included in Dragonfly REST API path?

@seokho-son @powerkimhub

According to cloud-barista/cb-tumblebug#203,

one should call the Dragonfly REST API like this:

curl -sX GET http://localhost:9090/dragonfly/mcis/aws-us-east-1-shson/vm/aws-us-east-1-shson-01/metric/cpu/rt-info?statisticsCriteria=max

But the logical hierarchy of NS, MCIS, and VM (in Tumblebug) is like this:

  • NS
    • MCIS
      • VM

How about this option?

curl -sX GET http://localhost:9090/dragonfly/ns/my-namespace/mcis/aws-us-east-1-shson/vm/aws-us-east-1-shson-01/metric/cpu/rt-info?statisticsCriteria=max

When using Docker Compose, `influxdb`'s `external_port` should be 8086, not 28086, in `config.yaml`

@hyokyungk I would appreciate it if you could take a look at this. 😊

What happened
:
As described in the CB-DF README,
with the value of external_port: under influxdb: in the conf/config.yaml file
set to 28086, as below,

   # influxdb connection info
   influxdb:
     endpoint_url: http://cb-dragonfly-influxdb           # endpoint for influxDB
     internal_port: 8086
     external_port: 28086

running sudo make compose-up causes cb-dragonfly to fail to ping http://cb-dragonfly-influxdb:28086,
so the cb-dragonfly container keeps exiting and restarting:

 tcp 172.18.0.4:28086: connect: connection refused 
panic: Get "http://cb-dragonfly-influxdb:28086/ping?wait_for_leader=5s": dial tcp 172.18.0.4:28086: connect: connection refused

[CLOUD-BARISTA].[ERROR]: 2021-07-06 16:06:09 influxdb.go:63, github.com/cloud-barista/cb-dragonfly/pkg/metricstore/influxdb/v1.Storage.Initialize() - failed to ping InfluxDB, error=Get "http://cb-dragonfly-influxdb:28086/ping?wait_for_leader=5s": dial tcp 172.18.0.4:28086: connect: connection refused 
[CLOUD-BARISTA].[ERROR]: 2021-07-06 16:06:09 main.go:84, main.main() - failed to initialize influxDB, error=Get "http://cb-dragonfly-influxdb:28086/ping?wait_for_leader=5s": dialmain.main()
        /go/src/github.com/cloud-barista/cb-dragonfly/pkg/manager/main/main.go:85 +0x5d0
time="2021-07-06T16:06:21+09:00" level=info msg="create tick file with name default.tick"
panic: Get "http://cb-dragonfly-influxdb:28086/ping?wait_for_leader=5s": dial tcp 172.18.0.4:28086: connect: connection refused

goroutine 1 [running]:
main.main()
        /go/src/github.com/cloud-barista/cb-dragonfly/pkg/manager/main/main.go:85 +0x5d0
time="2021-07-06T16:06:37+09:00" level=info msg="create tick file with name default.tick"
[CLOUD-BARISTA].[ERROR]: 2021-07-06 16:06:37 influxdb.go:63, github.com/cloud-barista/cb-dragonfly/pkg/metricstore/influxdb/v1.Storage.Initialize() - failed to ping InfluxDB, error=Get "http://cb-dragonfly-influxdb:28086/ping?wait_for_leader=5s": dial tcp 172.18.0.4:28086: connect: connection refused 
[CLOUD-BARISTA].[ERROR]: 2021-07-06 16:06:37 main.go:84, main.main() - failed to initialize influxDB, error=Get "http://cb-dragonfly-influxdb:28086/ping?wait_for_leader=5s": dial tcp 172.18.0.4:28086: connect: connection refused 
panic: Get "http://cb-dragonfly-influxdb:28086/ping?wait_for_leader=5s": dial tcp 172.18.0.4:28086: connect: connection refused

goroutine 1 [running]:
main.main()
        /go/src/github.com/cloud-barista/cb-dragonfly/pkg/manager/main/main.go:85 +0x5d0

After changing the external_port: value to 8086,

influxdb:
  endpoint_url: http://cb-dragonfly-influxdb       # endpoint for influxDB
  internal_port: 8086
  external_port: 8086

running sudo make compose-up starts cb-dragonfly without any problems:

❯ docker ps
CONTAINER ID        IMAGE                                              COMMAND                  CREATED             STATUS              PORTS                                                NAMES
bddbb7a12a89        chronograf:1.8.4-alpine                            "/entrypoint.sh --in…"   20 minutes ago      Up 12 minutes       0.0.0.0:8888->8888/tcp                               cb-dragonfly-chronograf
5ead5c5eeb27        cloudbaristaorg/cb-dragonfly:espresso-v0.1-kafka   "cb-dragonfly"           20 minutes ago      Up 12 minutes       0.0.0.0:9090->9090/tcp, 0.0.0.0:9999->9999/tcp       cb-dragonfly
e9af6eeb1a2b        kapacitor:1.5                                      "/entrypoint.sh kapa…"   20 minutes ago      Up 12 minutes       0.0.0.0:29092->9092/tcp                              cb-dragonfly-kapacitor
d944b857e45d        influxdb:1.8-alpine                                "/entrypoint.sh infl…"   20 minutes ago      Up 12 minutes       0.0.0.0:28086->8086/tcp, 0.0.0.0:28088->8088/tcp     cb-dragonfly-influxdb
f932dde06d05        wurstmeister/kafka:2.12-2.4.1                      "start-kafka.sh"         29 minutes ago      Up 12 minutes       0.0.0.0:9092->9092/tcp                               cb-dragonfly-kafka
09f05fec263d        wurstmeister/zookeeper                             "/bin/sh -c '/usr/sb…"   29 minutes ago      Up 12 minutes       22/tcp, 2888/tcp, 3888/tcp, 0.0.0.0:2181->2181/tcp   cb-dragonfly-zookeeper

What you expected to happen
:

How to reproduce it (as minimally and precisely as possible)
:

Anything else we need to know?
:

Environment

  • Source version or branch:
  • OS:
  • Others:

Proposed solution
:

(Proposal)
When CB-Dragonfly is run in Docker Compose mode, Helm chart mode, etc.,
have it read the value of internal_port: from config.yaml
instead of external_port: (a rough sketch follows).
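A rough sketch of that proposal, assuming the config is read with Viper and that a deploy-type value is available to distinguish Compose/Helm runs; the helper name and the deploy-type values are assumptions.

package config

import (
	"fmt"

	"github.com/spf13/viper"
)

// influxDBAddr picks the port to dial: inside a Compose/K8s network the
// service listens on its internal port (8086), while external_port (28086)
// is only the host-side mapping.
func influxDBAddr(v *viper.Viper, deployType string) string {
	port := v.GetString("influxdb.external_port")
	if deployType == "compose" || deployType == "helm" {
		port = v.GetString("influxdb.internal_port")
	}
	return fmt.Sprintf("%s:%s", v.GetString("influxdb.endpoint_url"), port)
}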

Any other context
:

For cb-dragonflyconfig.yaml in the cb-operator repo,
the external_port: value has already been changed from 28086 to 8086.

[cb-operator]

When run with Docker Compose (e.g. via cb-operator), DF exits if Kafka is not ready

What happened
:

Even though the depends_on: option is declared in docker-compose.yaml as shown below,

    depends_on:
#     - cb-restapigw
      - cb-dragonfly-zookeeper
      - cb-dragonfly-kafka
      - cb-dragonfly-influxdb
      - cb-dragonfly-kapacitor

when run with Docker Compose (e.g. via cb-operator), DF exits if Kafka is not ready.

./operator info

[v]Status of Cloud-Barista runtimes
          Name                         Command               State                                                    Ports
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------
cb-dragonfly                cb-dragonfly                     Exit 2
cb-dragonfly-influxdb       /entrypoint.sh influxd           Up       0.0.0.0:28083->8083/tcp, 0.0.0.0:28086->8086/tcp
cb-dragonfly-kafka          start-kafka.sh                   Up       0.0.0.0:9092->9092/tcp
cb-dragonfly-kapacitor      /entrypoint.sh kapacitord        Up       0.0.0.0:29092->9092/tcp
cb-dragonfly-zookeeper      /bin/sh -c /usr/sbin/sshd  ...   Up       0.0.0.0:2181->2181/tcp, 22/tcp, 2888/tcp, 3888/tcp
cb-ladybug                  /app/cb-ladybug                  Up       0.0.0.0:8080->8080/tcp
cb-restapigw                /app/cb-restapigw -c /app/ ...   Up       0.0.0.0:8000->8000/tcp, 0.0.0.0:8001->8001/tcp
cb-restapigw-grafana        /run.sh                          Up       0.0.0.0:3100->3000/tcp
cb-restapigw-influxdb       /entrypoint.sh influxd           Up       0.0.0.0:8083->8083/tcp, 0.0.0.0:8086->8086/tcp
cb-restapigw-jaeger         /go/bin/all-in-one-linux - ...   Up       14250/tcp, 0.0.0.0:14268->14268/tcp, 0.0.0.0:16686->16686/tcp, 5775/udp, 5778/tcp, 6831/udp, 6832/udp
cb-spider                   /root/go/src/github.com/cl ...   Up       0.0.0.0:1024->1024/tcp, 0.0.0.0:2048->2048/tcp, 4096/tcp
cb-tumblebug                /app/src/cb-tumblebug            Up       0.0.0.0:1323->1323/tcp, 0.0.0.0:50252->50252/tcp
cb-tumblebug-phpliteadmin   /usr/bin/caddy --conf /etc ...   Up       0.0.0.0:2015->2015/tcp, 443/tcp, 80/tcp

docker logs cb-dragonfly

[CLOUD-BARISTA].[INFO]: 2021-04-06 15:03:43 nutsdb-driver.go:32, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.initialize() - ######## dbfile: /go/src/github.com/cloud-barista/cb-dragonfly/meta_db/dat
kafka is not responding dial tcp 172.18.0.6:9092: connect: connection refused
panic: dial tcp 172.18.0.6:9092: connect: connection refused

goroutine 1 [running]:
main.main()
        /go/src/github.com/cloud-barista/cb-dragonfly/pkg/manager/main/main.go:43 +0x3d8

What you expected to happen
:
DF should wait until Kafka is ready (a retry sketch follows).
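A minimal sketch of what waiting could look like on the DF side (depends_on: only orders container start, it does not wait for readiness); the broker address and timeouts are examples, not existing DF code.

package main

import (
	"fmt"
	"log"
	"net"
	"time"
)

// waitForKafka retries a plain TCP dial to the broker until it accepts
// connections or the deadline passes, instead of panicking on the first
// "connection refused".
func waitForKafka(addr string, deadline time.Duration) error {
	start := time.Now()
	for {
		conn, err := net.DialTimeout("tcp", addr, 3*time.Second)
		if err == nil {
			conn.Close()
			return nil
		}
		if time.Since(start) > deadline {
			return fmt.Errorf("kafka is not responding after %s: %w", deadline, err)
		}
		log.Printf("kafka not ready (%v), retrying in 5s", err)
		time.Sleep(5 * time.Second)
	}
}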

How to reproduce it (as minimally and precisely as possible)
:
./operator run

Anything else we need to know?
:

Environment

  • Source version or branch:
  • OS:
  • Others:

Proposed solution
:

Any other context
:

Agent installation fails in v0.6.9

What happened

  • Agent installation fails

When requesting agent installation to CB-DF (v0.6.9) using the latest CB-TB release (v0.6.18 plus handling of "service_type": "mcis"),

HTTP 500 is returned with the following error message:
"result": "{"message":"failed to change telegraf permission, err=Process exited with status 1"}\n"

How to reproduce it (as minimally and precisely as possible)

Anything else we need to know?
With the previous CB-DF 0.6.7, agent installation itself worked fine; with v0.6.9 the agent installation problem occurs as well. (Push monitoring already had problems before.)

Any other context

Log from the CB-DF instance:

[2022-11-23T22:06:40+09:00] <MCIS> collector scheduler - Now Scheduling ###
[2022-11-23T22:06:40+09:00] <MCIS> Add Topics Queue ## : []
[2022-11-23T22:06:40+09:00] <MCIS> Del Topics Queue ## : []
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:40 nutsdb-driver.go:116, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Put() - Key:/push/collectorTopicMap, value:{"TopicMap":{},"CollectorPerAgentCnt":[]} 
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 1007 (HY000) at line 1: Can't create database 'sysbench'; database exists
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 1396 (HY000) at line 1: Operation CREATE USER failed for 'sysbench'@'localhost'
mysql: [Warning] Using a password on the command line interface can be insecure.
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:44 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:44 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:44 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:45 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:45 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:45 sshrun.go:159, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHCopy() - call SSHCopy() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:45 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:45 sshrun.go:123, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Copy() - call Copy() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:45 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:45 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:45 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:46 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:46 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:46 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:46 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 1007 (HY000) at line 1: Can't create database 'sysbench'; database exists
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 1396 (HY000) at line 1: Operation CREATE USER failed for 'sysbench'@'localhost'
mysql: [Warning] Using a password on the command line interface can be insecure.
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:47 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:48 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:48 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:48 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:48 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:48 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:48 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:48 sshrun.go:159, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHCopy() - call SSHCopy() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:48 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:48 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g2-1_gcp 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g2-1_gcp 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 sshrun.go:123, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Copy() - call Copy() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g2-1_gcp 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:49 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g2-1_gcp 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[2022-11-23T22:06:50+09:00] <MCIS> collector scheduler - Now Scheduling ###
[2022-11-23T22:06:50+09:00] <MCIS> Add Topics Queue ## : []
[2022-11-23T22:06:50+09:00] <MCIS> Del Topics Queue ## : []
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 nutsdb-driver.go:116, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Put() - Key:/push/collectorTopicMap, value:{"TopicMap":{},"CollectorPerAgentCnt":[]} 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g2-1_gcp 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
usermod: no changes
Job for telegraf.service failed because the control process exited with error code.
See "systemctl status telegraf.service" and "journalctl -xe" for details.
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:50 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
Removed /etc/systemd/system/multi-user.target.wants/telegraf.service.
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:51 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g1-1_aws 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:52 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g1-1_aws 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g1-1_aws 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g1-1_aws 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 nutsdb-driver.go:133, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Get() - Key:/monitoring/agents/ns01_mcis_mcis01_g1-1_aws 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:53 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:54 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
usermod: no changes
Job for telegraf.service failed because the control process exited with error code.
See "systemctl status telegraf.service" and "journalctl -xe" for details.
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:54 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:54 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:54 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:55 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
Removed /etc/systemd/system/multi-user.target.wants/telegraf.service.
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:55 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:55 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:55 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:56 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:56 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:56 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:56 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:57 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:57 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:57 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:57 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:57 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:57 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:57 sshrun.go:133, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.SSHRun() - call SSHRun() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:57 sshrun.go:41, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Connect() - call Connect() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:58 sshrun.go:90, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.RunCommand() - call RunCommand() 
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:06:58 sshrun.go:84, github.com/cloud-barista/cb-spider/cloud-control-manager/vm-ssh.Close() - call Close() 
[2022-11-23T22:07:00+09:00] <MCIS> collector scheduler - Now Scheduling ###
[2022-11-23T22:07:00+09:00] <MCIS> Add Topics Queue ## : []
[2022-11-23T22:07:00+09:00] <MCIS> Del Topics Queue ## : []
[CLOUD-BARISTA].[INFO]: 2022-11-23 22:07:00 nutsdb-driver.go:116, github.com/cloud-barista/cb-store/store-drivers/nutsdb-driver.(*NUTSDBDriver).Put() - Key:/push/collectorTopicMap, value:{"TopicMap":{},"CollectorPerAgentCnt":[]} 

Hierarchical key structure when storing metadata

@hyokyungk Could you please review this? 😊

What would you like to be enhanced
:

Currently, CB-Bridge is considering/working on changing the metadata
that each framework (SP, TB, LB, DF, etc.) stores in nutsdb
so that, in Cloud-Barista's K8s mode, it is stored in a single etcd. (related discussion)

Looking into the key structure each framework uses for key-value storage,
CB-Dragonfly creates keys such as the following
at the root of the key space:

  • agentlist (does not start with a slash)
  • {agentUUID} (likewise)
  • delTopics/{topic}
  • {eventLog.Id}
  • config/{key}
  • collectorGroupTopic/{collectorIdx}
  • topic/{topic}

(Related table: cloud-barista/cb-operator#139 (comment))

It would be good if these keys were tidied up into a more hierarchical structure.

Why is this needed
:

Proposed solution
:

(I do not know Dragonfly's logical data relation/hierarchy/structure well, so the examples below may not be appropriate.)

For example (a hypothetical key-builder sketch follows this list):

  • /monitoring/agents/{agentID}
  • /monitoring/collectors/{collectorID}
  • /monitoring/topics/{topicID}
  • (Another option would be to add keys such as /ns/{nsId}/mcis/{mcisId}/vm/{vmId}/metrics/{metric}
    under the /ns/{nsId}/mcis/{mcisId}/vm/{vmId} keys created by TB,
    but I am not sure whether that would be appropriate.)
  • ...
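A hypothetical key-builder sketch for such a layout; the names and prefixes are illustrative only, not current DF code:

package store

import "fmt"

const monitoringRoot = "/monitoring"

// Helpers that keep every DF key under a single hierarchical prefix.
func AgentKey(agentUUID string) string       { return fmt.Sprintf("%s/agents/%s", monitoringRoot, agentUUID) }
func CollectorKey(collectorID string) string { return fmt.Sprintf("%s/collectors/%s", monitoringRoot, collectorID) }
func TopicKey(topicID string) string         { return fmt.Sprintf("%s/topics/%s", monitoringRoot, topicID) }
func ConfigKey(name string) string           { return fmt.Sprintf("%s/configs/%s", monitoringRoot, name) }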

"failed to get package. osType DEBIAN not supported" error occurs for a GCP Debian 10 VM

What happened
:

[GCP VM info]

[Agent install request body]
POST http://localhost:9090/dragonfly/agent

{
  "ns_id": "ns01",
  "mcis_id": "jhseo",
  "vm_id": "gcp-us-central1-1",
  "public_ip": "34.134.240.49",
  "port": "22",
  "user_name": "cb-user",
  "ssh_key": "-----BEGIN RSA PRIVATE KEY-----\n...\n-----END RSA PRIVATE KEY-----\n",
  "cspType": "gcp",
  "service_type": "vm"
}

[Agent install response body]

{
  "message": "failed to get package. osType DEBIAN not supported"
}
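For context, the message suggests a dispatch over osType when selecting the telegraf package. Since Debian uses the same .deb/apt tooling as Ubuntu, one possible shape of a fix is sketched below; this is hypothetical, not the actual DF code.

package agent

import (
	"fmt"
	"strings"
)

// telegrafPackageFor maps the detected OS to a telegraf package flavor.
func telegrafPackageFor(osType string) (string, error) {
	switch strings.ToUpper(osType) {
	case "UBUNTU", "DEBIAN": // Debian is deb-based like Ubuntu
		return "deb", nil
	case "CENTOS", "RHEL":
		return "rpm", nil
	default:
		return "", fmt.Errorf("failed to get package. osType %s not supported", osType)
	}
}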

What you expected to happen
:

How to reproduce it (as minimally and precisely as possible)
:

Anything else we need to know?
:

Environment

  • Source version or branch:
    • CB-Spider latest version (f09c6c3, Fri Oct 21 20:11:55 2022 +0900)
    • CB-Tumblebug latest version (5b53b8c, Mon Oct 24 17:43:32 2022 +0900,
      after PR cloud-barista/cb-tumblebug#1239 was merged)
    • CB-Dragonfly latest version
  • OS:
  • Others:

Proposed solution
:

Any other context
:
