
Comments (149)

Eneuman avatar Eneuman commented on July 24, 2024 2

Hi @amsoedal
Yes, it's running smoothly again now. Thank you!

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024 2

@amsoedal Yes, now it's working again.
It hit one "pods/klinikportalservice-75588487b4-vtwp4: BackOff: Back-off restarting failed container" error, but then it started to build.
Thanks!

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024 1

Thank you!
Let me know if there is anything I can do to help!

Feel free to access our cluster if you need to troubleshoot.

//Per

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024 1

@daniv-msft Yes we are interested in transitioning to Bridge to Kubernetes but two features are missing.

Pod Identity (as you said) and the ability to stop all other instances of a particular pod when we start debugging it.

We rely heavily on Azure Service Bus and pub/sub, so when debugging a service we don't want any other running instances of the pod we are debugging to receive the events.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024 1

It's working fine now. Thank you!

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024 1

@amsoedal Yes, everything is working fine now. Thank you!

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024 1

@daniv-msft Hi, yes, we are interested in trying out the new bits. I just replied to @pragyamehta's email about our current timeframe :)

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024 1

@rakeshvanga Can you please restart it again?
We're getting these issues now:

Using dev space 'dev' with controller 'aks-we-eclinic-dev'
Synchronizing files...10s
This version of Kubernetes is no longer supported. Please upgrade your cluster to a supported Kubernetes version and retry.

A restart normally fixes this :)

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024 1

from dev-spaces.

philon-msft avatar philon-msft commented on July 24, 2024

No, we don't know of any issues that would cause synchronization to fail for you. Have any changes in your project caused more or bigger files to be synced?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Hi and thanks for the quick reply

Nothing has changed in the project, but the project is fairly large (around 20,000 files).
I just tried again and now it synced fine and finished the build in just 1.4 minutes.

We keep seeing this performance degradation, and it always occurs around 16:30 UTC.
Our large project stops synchronizing and the smaller ones take a lot longer, but the next morning it works fine again.

We have tried 3 different networks from 3 different carriers with the same result.

We have run performance checks on our Kubernetes cluster, but the node has plenty of resources left, and the application on the cluster runs smoothly and responsively.
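For context, a minimal sketch of the kind of checks we mean (assuming the metrics-server add-on is installed; the node name is a placeholder):

kubectl top nodes                   # per-node CPU/memory usage
kubectl top pods -n dev             # per-pod usage in the 'dev' space
kubectl describe node <node-name>   # allocatable vs. requested resources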

Does the synchronization happen just between our clients and our cluster, or are there some other servers involved as well?

from dev-spaces.

philon-msft avatar philon-msft commented on July 24, 2024

Hi @Eneuman, it sounds like there may be a pattern of high network traffic around 16:30 in nearby resources in the data center where your cluster and Controller are hosted. Are you in the West Europe region? There are some other Dev Spaces resources involved in the sync operation, but we don't have anything scheduled to run around that time.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@philon-msft Yes, I am in the West Europe region.

And like clockwork, the synchronization just stopped working again :(
"Timed out after 42s trying to list azds pods"

Running "kubectl get pods" returns the pods in under a second.

At the moment, Dev Spaces is unusable for us after 16:30.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

And now it's back up and working without any changes on our side. This has got to be a resource problem. Do you have any Dev Spaces telemetry from the West Europe region you can take a look at? :)

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Synchronization stopped working again 6/11 14:45 UTC.
Today, 7/11 11:00 UTC, it is still not working. All our projects (even the small ones) receive a synchronization timeout.
The cluster looks fine; no changes have been made.

This is really frustrating... Please help us figure out what is going on.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Still not working :(

Using dev space dev with controller aks-we-eclinic-dev
Synchronizing files...4m 0s
Timed out waiting for file synchronization to complete

And the command azds down gives the following error:
Timed out after 1m trying to list services.

but azds list-up works fine.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Still not working :(

azds down
Timed out after 1m trying to list services.

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

@Eneuman Sorry you're encountering this issue. :( We're having a look into this, and will keep you updated.

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

Adding @shpathak-msft, who is looking into this.

from dev-spaces.

shpathak-msft avatar shpathak-msft commented on July 24, 2024

@Eneuman Could you please reach out to me at [email protected]? I have some logs from our backend service that I would like to share with you.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

from dev-spaces.

shpathak-msft avatar shpathak-msft commented on July 24, 2024

Thank you for reaching out, and I'm glad to know code synchronization is now working for you. I suppose you already know this, but I would suggest reaching out to AKS support if you continue to see the "could not get apiVersions from Kubernetes: unable to retrieve the complete list of server APIs: metrics.k8s.io/v1beta1" error.
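As a quick check before contacting AKS support, the state of that aggregated API can be inspected with something like the following (a sketch, not an official diagnostic; the metrics-server label is an assumption about your cluster's setup):

kubectl get apiservices v1beta1.metrics.k8s.io                # should report Available=True
kubectl -n kube-system get pods -l k8s-app=metrics-server     # is the metrics-server pod healthy?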

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Thank you for getting synchronization working again.
This has happened to us before, and it blocks us from working when it does. Are there any actions being taken to stop this issue from occurring again?

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

@Eneuman Happy that we unblocked you! The issue you encountered is tricky to investigate because it is either transient, or it relies on us restarting the component on our side associated with your Azure Dev Spaces controller. This component didn't show specific errors itself that would help us find the root cause. When this issue happened, it also didn't correlate with a deployment on our side or with code changes.
We know that Azure Dev Spaces relies on a complex codebase and can be unreliable, and this is part of the reason why we decided to move to another approach.

If I remember correctly, you investigated transitioning to Bridge to Kubernetes in the past but cannot do it at present because you rely on pod identity for your services. Is this correct? We're currently investigating adding support for pod identity and, if that's the only blocker for you, we would be happy to provide you with early bits as soon as we have something to share.

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

@Eneuman Thanks for your quick reply.
All right, as of today we indeed expect the targeted service to be backed by only one pod, and we fail if that's not the case. We should be able to automatically decrease the number of pods to 1 when debugging and increase it back afterwards; a manual sketch of that workaround is below. I'm bumping the priority of this in our backlog so that we align it with our pod identity story.
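In the meantime, the manual equivalent would be something along these lines (a sketch only; the deployment name, namespace, and original replica count are placeholders based on this thread):

kubectl -n dev scale deployment klinikportalservice --replicas=1   # before starting to debug
kubectl -n dev scale deployment klinikportalservice --replicas=3   # restore the original count afterwards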

Regarding pod identity, could you please confirm that your team members are using Windows machines for development? From previous discussions, I believe that you're using Windows yourself.
In this case, the target OS is important, as pod identity might need a specific implementation for each OS (because networking is done very differently on Windows compared to Linux, for example).

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@daniv-msft Yes, we are only using Windows machines and Visual Studio 2019 Pro.

Really looking forward to trying the new bits out :)

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

@Eneuman All right, thanks for confirming!
Sounds good, we'll keep you in the loop as soon as we have a beta for pod identity.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@daniv-msft I think we just got this issue again :(
Can you please take a look and see if you can get it running again?

Enabling debugging...
Timed out after 1m trying to list azds pods.

Regards
Per

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

Hi @Eneuman, I've restarted some of the components on our side. Can you let me know if the issue keeps reproducing? Thanks!

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal
Hi again, maybe I was too quick there.
I'm still having problems with at least two of my pods.

One is giving me this error. Is this on my side?

pods/klinikportalservice-757c5cc75b-cd6b7: BackOff: Back-off restarting failed container

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

Hi @Eneuman, sorry you're still having issues! Could you please run kubectl logs pods/klinikportalservice-757c5cc75b-cd6b7 -p (you might need to specify a container name with -c, but it should prompt you if necessary) to see what happened?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal

Hmm, for some reason it seems like the Azure ADO cert is no longer valid. Maybe a service issue?

kubectl logs -n dev pods/klinikportalservice-757c5cc75b-cd6b7 devspaces-build

2021-02-15T17:53:35.9003579Z | BuildService | TRACE | Starting build process.\nOperation context: <json>{"clientRequestId":"5a8260c7-6a35-4651-ba04-b960705bc226","correlationRequestId":"113b9760-35ec-4773-b952-a3733a6be493","requestId":null,"userSubscriptionId":null,"startTime":"2021-02-15T17:53:34.9226682Z","userAgent":"BuildSvc/1.0.20210206.1","requestHttpMethod":null,"requestUri":null,"apiVersion":null,"version":"1.0.20210206.1","requestHeaders":{},"loggingProperties":{"applicationName":"BuildService","deviceOperatingSystem":"Linux 5.4.0-1031-azure #32~18.04.1-Ubuntu SMP Tue Oct 6 10:03:22 UTC 2020","framework":".NET Core 4.6.28325.01"}}</json>
2021-02-15T17:53:35.9076895Z | BuildService | TRACE | Download workspace
2021-02-15T17:53:35.9646040Z | BuildService | TRACE | Log Handler started
2021-02-15T17:53:36.1176640Z | BuildService | WARNG | DownloadWorkspaceAsync failed with {"Message":"The SSL connection could not be established, see inner exception.","Data":{},"InnerException":{"ClassName":"System.Security.Authentication.AuthenticationException","Message":"The remote certificate is invalid according to the validation procedure.","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":"   at System.Net.Security.SslState.StartSendAuthResetSignal(ProtocolToken message, AsyncProtocolRequest asyncRequest, ExceptionDispatchInfo exception)\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequestasyncRequest)\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.PartialFrameCallback(AsyncProtocolRequest asyncRequest)\n--- End of stack trace from previous location where exception was thrown ---\n   at System.Net.Security.SslState.ThrowIfExceptional()\n   at System.Net.Security.SslState.InternalEndProcessAuthentication(LazyAsyncResult lazyResult)\n   at System.Net.Security.SslState.EndProcessAuthentication(IAsyncResult result)\n   at System.Net.Security.SslStream.EndAuthenticateAsClient(IAsyncResult asyncResult)\n   at System.Net.Security.SslStream.<>c.<AuthenticateAsClientAsync>b__47_1(IAsyncResult iar)\n   at System.Threading.Tasks.TaskFactory`1.FromAsyncCoreLogic(IAsyncResult iar, Func`2 endFunction, Action`1 endAction, Task`1 promise, Boolean requiresSynchronization)\n--- End of stack trace from previous location where exception was thrown ---\n   at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken cancellationToken)","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233087,"Source":"System.Private.CoreLib","WatsonBuckets":null},"StackTrace":"   at Microsoft.Rest.RetryDelegatingHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)\n   at 
Microsoft.Azure.DevSpaces.Common.Auth.Handlers.AuthErrorHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in /app/common.auth/Handlers/AuthErrorHttpHandler.cs:line 47\n   at Microsoft.Azure.DevSpaces.Common.Auth.Handlers.ServiceClientCredentialsHttpHandler`1.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in /app/common.auth/Handlers/ServiceClientCredentialsHttpHandler.cs:line 34\nat System.Net.Http.HttpClient.FinishSendAsyncUnbuffered(Task`1 sendTask, HttpRequestMessage request, CancellationTokenSource cts, Boolean disposeCts)\n   at Microsoft.Azure.DevSpaces.ExecService.Client.WorkspaceOperations.DownloadSourceWithHttpMessagesAsync(String spaceName, String serviceName, String name, Dictionary`2 customHeaders, CancellationToken cancellationToken) in /app/execsvc.client.autogen/WorkspaceOperations.cs:line 161\n   at Microsoft.Azure.DevSpaces.ExecService.Client.WorkspaceOperationsExtensions.DownloadSourceAsync(IWorkspaceOperations operations, String spaceName, String serviceName, String name, CancellationToken cancellationToken) in /app/execsvc.client.autogen/WorkspaceOperationsExtensions.cs:line 49\n   at Microsoft.Azure.DevSpaces.Build.BuildClient.DownloadWorkspaceAsync() in /app/build/BuildClient.cs:line 154","HelpLink":null,"Source":"Microsoft.Rest.ClientRuntime","HResult":-2146233087}, retry.
2021-02-15T17:53:40.2424473Z | BuildService | ERROR | Logging handled exception: System.Net.Http.HttpRequestException: {"Message":"The SSL connection could not be established, see inner exception.","Data":{},"InnerException":{"ClassName":"System.Security.Authentication.AuthenticationException","Message":"The remote certificate is invalid according to the validation procedure.","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":"   at System.Net.Security.SslState.StartSendAuthResetSignal(ProtocolToken message, AsyncProtocolRequest asyncRequest, ExceptionDispatchInfo exception)\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming,Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.PartialFrameCallback(AsyncProtocolRequest asyncRequest)\n--- End of stack trace from previous location where exception was thrown ---\n   at System.Net.Security.SslState.ThrowIfExceptional()\n   at System.Net.Security.SslState.InternalEndProcessAuthentication(LazyAsyncResult lazyResult)\n   at System.Net.Security.SslState.EndProcessAuthentication(IAsyncResult result)\n   at System.Net.Security.SslStream.EndAuthenticateAsClient(IAsyncResult asyncResult)\n   at System.Net.Security.SslStream.<>c.<AuthenticateAsClientAsync>b__47_1(IAsyncResult iar)\n   at System.Threading.Tasks.TaskFactory`1.FromAsyncCoreLogic(IAsyncResult iar, Func`2 endFunction, Action`1 endAction, Task`1 promise, Boolean requiresSynchronization)\n--- End of stack trace from previous location where exception was thrown ---\n   at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken cancellationToken)","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233087,"Source":"System.Private.CoreLib","WatsonBuckets":null},"StackTrace":"   at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken 
cancellationToken)\n   at System.Threading.Tasks.ValueTask`1.get_Result()\n   at System.Net.Http.HttpConnectionPool.CreateConnectionAsync(HttpRequestMessage request, CancellationToken cancellationToken)\n   at System.Threading.Tasks.ValueTask`1.get_Result()\n   at System.Net.Http.HttpConnectionPool.WaitForCreatedConnectionAsync(ValueTask`1 creationTask)\n   at System.Threading.Tasks.ValueTask`1.get_Result()\n   at System.Net.Http.HttpConnectionPool.SendWithRetryAsync(HttpRequestMessage request, Boolean doRequestAuth, CancellationToken cancellationToken)\n   at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)\n   at Microsoft.Azure.DevSpaces.Common.Auth.Handlers.ServiceClientCredentialsHttpHandler`1.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in /app/common.auth/Handlers/ServiceClientCredentialsHttpHandler.cs:line 34\n  at System.Net.Http.HttpClient.FinishSendAsyncBuffered(Task`1 sendTask, HttpRequestMessage request, CancellationTokenSource cts, Boolean disposeCts)\n   at Microsoft.Azure.DevSpaces.Common.Logging.AzdsHttpTelemetryLogger.<>c__DisplayClass12_0.<<-ctor>b__0>d.MoveNext() in /app/common/Logging/Loggers/AzdsHttpTelemetryLogger.cs:line 44\n--- End of stack trace from previous location where exception was thrown ---\n   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)\n--- End of stack trace from previous location where exception was thrown ---\n   at System.Threading.ThreadPoolWorkQueue.Dispatch()","HelpLink":null,"Source":"System.Net.Http","HResult":-2146233087}
2021-02-15T17:53:40.2727271Z | BuildService | ERROR | Unhandled AppDomain exception! System.Net.Http.HttpRequestException: The SSL connection could not be established, seeinner exception.
2021-02-15T17:53:40.2731763Z | BuildService | ERROR | AppDomain base exception details: System.Security.Authentication.AuthenticationException: The remote certificate is invalid according to the validation procedure.
2021-02-15T17:53:40.2736504Z | BuildService | ERROR | Is AppDomain terminating? 'True'

Unhandled Exception: System.Net.Http.HttpRequestException: The SSL connection could not be established, see inner exception. ---> System.Security.Authentication.AuthenticationException: The remote certificate is invalid according to the validation procedure.
   at System.Net.Security.SslState.StartSendAuthResetSignal(ProtocolToken message, AsyncProtocolRequest asyncRequest, ExceptionDispatchInfo exception)
   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)
   at System.Net.Security.SslState.PartialFrameCallback(AsyncProtocolRequest asyncRequest)
--- End of stack trace from previous location where exception was thrown ---
   at System.Net.Security.SslState.ThrowIfExceptional()
   at System.Net.Security.SslState.InternalEndProcessAuthentication(LazyAsyncResult lazyResult)
   at System.Net.Security.SslState.EndProcessAuthentication(IAsyncResult result)
   at System.Net.Security.SslStream.EndAuthenticateAsClient(IAsyncResult asyncResult)
   at System.Net.Security.SslStream.<>c.<AuthenticateAsClientAsync>b__47_1(IAsyncResult iar)
   at System.Threading.Tasks.TaskFactory`1.FromAsyncCoreLogic(IAsyncResult iar, Func`2 endFunction, Action`1 endAction, Task`1 promise, Boolean requiresSynchronization)
--- End of stack trace from previous location where exception was thrown ---
   at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken cancellationToken)
   --- End of inner exception stack trace ---
   at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken cancellationToken)
   at System.Threading.Tasks.ValueTask`1.get_Result()
   at System.Net.Http.HttpConnectionPool.CreateConnectionAsync(HttpRequestMessage request, CancellationToken cancellationToken)
   at System.Threading.Tasks.ValueTask`1.get_Result()
   at System.Net.Http.HttpConnectionPool.WaitForCreatedConnectionAsync(ValueTask`1 creationTask)
   at System.Threading.Tasks.ValueTask`1.get_Result()
   at System.Net.Http.HttpConnectionPool.SendWithRetryAsync(HttpRequestMessage request, Boolean doRequestAuth, CancellationToken cancellationToken)
   at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.DevSpaces.Common.Auth.Handlers.ServiceClientCredentialsHttpHandler`1.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in /app/common.auth/Handlers/ServiceClientCredentialsHttpHandler.cs:line 34
   at System.Net.Http.HttpClient.FinishSendAsyncBuffered(Task`1 sendTask, HttpRequestMessage request, CancellationTokenSource cts, Boolean disposeCts)
   at Microsoft.Azure.DevSpaces.Common.Logging.AzdsHttpTelemetryLogger.<>c__DisplayClass12_0.<<-ctor>b__0>d.MoveNext() in /app/common/Logging/Loggers/AzdsHttpTelemetryLogger.cs:line 44
--- End of stack trace from previous location where exception was thrown ---
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location where exception was thrown ---
   at System.Threading.ThreadPoolWorkQueue.Dispatch()

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman Thanks for attaching this! Can you try again now?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Yes, now it's working again :) Thanks!

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

Great! Happy to hear it :)

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Hi again. It looks like this problem has started to occur again :(
The logs show the same problem as above.

Can you please apply your fix again?

azds up also gives me this error:

Oops... An unexpected error has occurred. A report of the error will be sent to Microsoft.
For diagnostic information, see Azure Dev Spaces logs at 'C:\Users\PerBornsjö\AppData\Local\Temp\Azure Dev Spaces'.
Please include the following Client Request ID when contacting support: 01779312-4e11-4199-a2b5-b8ed5de999cb

And the log contains this, so it looks like that SSL error again.

2021-02-21T16:26:34.2473765Z | CLI | WARNG | Logging handled exception: Microsoft.Azure.DevSpaces.Client.Exceptions.OperationIdException: {"RequestId":null,"ClientRequestId":"01779312-4e11-4199-a2b5-b8ed5de999cb","CorrelationRequestId":null,"Request":null,"Response":null,"Format":"The SSL connection could not be established, see inner exception.","Args":[],"Message":"The SSL connection could not be established, see inner exception.","Data":{},"InnerException":{"Message":"The SSL connection could not be established, see inner exception.","Data":{},"InnerException":{"ClassName":"System.Security.Authentication.AuthenticationException","Message":"The remote certificate is invalid according to the validation procedure.","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":"   at System.Net.Security.SslState.StartSendAuthResetSignal(ProtocolToken message, AsyncProtocolRequest asyncRequest, ExceptionDispatchInfo exception)\r\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\r\n   at System.Net.Security.SslState.PartialFrameCallback(AsyncProtocolRequest asyncRequest)\r\n--- End of stack trace from previous location where exception was thrown ---\r\n   at System.Net.Security.SslState.InternalEndProcessAuthentication(LazyAsyncResult lazyResult)\r\n   at System.Net.Security.SslState.EndProcessAuthentication(IAsyncResult result)\r\n   at System.Net.Security.SslStream.EndAuthenticateAsClient(IAsyncResult asyncResult)\r\n   at System.Net.Security.SslStream.<>c.<AuthenticateAsClientAsync>b__47_1(IAsyncResult iar)\r\n   at System.Threading.Tasks.TaskFactory`1.FromAsyncCoreLogic(IAsyncResult iar, Func`2 endFunction, Action`1 endAction, Task`1 promise, Boolean requiresSynchronization)\r\n--- End of stack trace from previous location where exception was thrown ---\r\n   at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, 
CancellationToken cancellationToken)","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233087,"Source":"System.Private.CoreLib","WatsonBuckets":null},"StackTrace":"   at Microsoft.Rest.RetryDelegatingHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)\r\n   at Microsoft.Azure.DevSpaces.Common.Auth.Handlers.AuthErrorHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\common.auth\\Handlers\\AuthErrorHttpHandler.cs:line 47\r\n   at Microsoft.Azure.DevSpaces.Common.Logging.OperationContextRequestIdsHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\common\\Logging\\OperationContextRequestIdsHttpHandler.cs:line 31\r\n   at Microsoft.Azure.DevSpaces.Client.ServiceClients.Handlers.ClientOperationContextRequestIdsHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\Handlers\\ClientOperationContextRequestIdsHttpHandler.cs:line 31\r\n   at Microsoft.Azure.DevSpaces.Client.ServiceClients.Handlers.UnsupportedRegionErrorHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\Handlers\\UnsupportedRegionErrorHttpHandler.cs:line 22\r\n   at Microsoft.Azure.DevSpaces.Client.ServiceClients.Handlers.ApiVersionCheckerHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\Handlers\\ApiVersionCheckerHttpHandler.cs:line 26\r\n   at Microsoft.Azure.DevSpaces.Common.Auth.Handlers.ServiceClientCredentialsHttpHandler`1.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\common.auth\\Handlers\\ServiceClientCredentialsHttpHandler.cs:line 34\r\n   at System.Net.Http.HttpClient.FinishSendAsyncBuffered(Task`1 sendTask, HttpRequestMessage request, CancellationTokenSource cts, Boolean disposeCts)\r\n   at Microsoft.Azure.DevSpaces.ExecService.Client.SpacesOperations.ListSpacesWithHttpMessagesAsync(Dictionary`2 customHeaders, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\execsvc.client.autogen\\SpacesOperations.cs:line 590\r\n   at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecExceptionStrategy.RunWithHandlingAsync[T](Func`1 func, FailureConfig failureConfig) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\ExceptionStrategies\\ExecExceptionStrategy.cs:line 59\r\n   at Microsoft.Azure.DevSpaces.Client.ManagementClients.SpaceManagementClientImplementation.ListSpacesAsync(CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\client\\Client\\ManagementClients\\SpaceManagementClientImplementation.cs:line 299\r\n   at Microsoft.Azure.DevSpaces.Client.Extensions.OwnedExtensions.<>c__DisplayClass2_0`2.<<TryRunOwnedOperationThenDisposeAsync>b__0>d.MoveNext() in C:\\A\\1\\55\\s\\src\\client\\Extensions\\OwnedExtensions.cs:line 62\r\n--- End of stack trace from previous location where exception was thrown ---\r\n   at Microsoft.Azure.DevSpaces.Client.Utilities.AutofacUtilities.TryRunWithErrorPropagationAsync[T](Func`1 func, ILog log, IOperationContext operationContext) in C:\\A\\1\\55\\s\\src\\client\\Utilities\\AutofacUtilities.cs:line 81","HelpLink":null,"Source":"Microsoft.Rest.ClientRuntime","HResult":-2146233087},"StackTrace":"   at 
Microsoft.Azure.DevSpaces.Client.Utilities.AutofacUtilities._Handle[T](Exception e, ILog log, IOperationContext operationContext, Boolean isRecurse) in C:\\A\\1\\55\\s\\src\\client\\Utilities\\AutofacUtilities.cs:line 163\r\n   at Microsoft.Azure.DevSpaces.Client.Utilities.AutofacUtilities.TryRunWithErrorPropagationAsync[T](Func`1 func, ILog log, IOperationContext operationContext) in C:\\A\\1\\55\\s\\src\\client\\Utilities\\AutofacUtilities.cs:line 87\r\n   at Microsoft.Azure.DevSpaces.Cli.Settings.DevSpacesSettingsManager.GetLocalSettingsAsync(CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\cli\\Settings\\DevSpacesSettingsManager.cs:line 124\r\n   at Microsoft.Azure.DevSpaces.Cli.AppContainerConfig.<>c.<BuildContainer>b__0_5(IComponentContext c) in C:\\A\\1\\55\\s\\src\\cli\\AppContainerConfig.cs:line 268\r\n   at Autofac.RegistrationExtensions.<>c__DisplayClass5_0`1.<Register>b__0(IComponentContext c, IEnumerable`1 p)\r\n   at Autofac.Builder.RegistrationBuilder.<>c__DisplayClass0_0`1.<ForDelegate>b__0(IComponentContext c, IEnumerable`1 p)\r\n   at Autofac.Core.Activators.Delegate.DelegateActivator.ActivateInstance(IComponentContext context, IEnumerable`1 parameters)\r\n   at Autofac.Core.Resolving.InstanceLookup.Activate(IEnumerable`1 parameters)","HelpLink":null,"Source":null,"HResult":-2146233088}
2021-02-21T16:26:34.2501562Z | CLI | TRACE | Event: Command.End <json>{"properties":{"arguments":"up","result":"Failed"},"metrics":{"duration":4066.0}}</json>

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

After upgrading my Kubernetes version, I'm now receiving this error message when running azds up: "This version of Kubernetes is no longer supported. Please upgrade your cluster to a supported Kubernetes version and retry."

Might be related to: #416

from dev-spaces.

rakeshvanga avatar rakeshvanga commented on July 24, 2024

@Eneuman Azure Dev Spaces is deprecated and is only supported on Kubernetes versions up to 1.18.*. After the upgrade, what is the Kubernetes version of the cluster?
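For reference, the cluster and node versions can be checked with something like this (a sketch; the resource group and cluster names are placeholders):

az aks show -g <resource-group> -n <cluster-name> --query kubernetesVersion -o tsv
kubectl version --short   # server version reported by the API server
kubectl get nodes         # version each node is actually running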
I'm looking into the SSL error and will update once it is resolved.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@rakeshvanga Will I be able to install an AKS cluster with version 1.18 all the way until 2023, when Dev Spaces is retired?

from dev-spaces.

greenie-msft avatar greenie-msft commented on July 24, 2024

Hi @Eneuman ,

Yes, Dev Spaces will only support v1.18 in AKS until 2023. Dev Spaces is being retired, and at this point, while it might remain in use for legacy reasons, it should not be used on new clusters.

We understand some limitations prevented you and your team from moving to Bridge to Kubernetes, and we are actively working on those scenarios to unblock you. Bridge to Kubernetes now supports debugging a service backed by multiple replicas and will soon support managed identities. I expect we'll have an experience for you to try in the next few weeks. Once both of those features are live, we'll be happy to help you and your team transition from legacy Dev Spaces to our improved offering, Bridge to Kubernetes.

Please let us know if you have any further questions. Thank you.

from dev-spaces.

rakeshvanga avatar rakeshvanga commented on July 24, 2024

@Eneuman, I've fixed the SSL error on our side. Can you try now?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@rakeshakkera Hi
The SSL error is unfortunately back :( Can you please take a look?

2021-02-24T11:31:38.5161606Z | BuildService | TRACE | Starting build process.\nOperation context: <json>{"clientRequestId":"a588e693-177b-4c98-bcc9-6f18bef6f738","correlationRequestId":"2d811ee3-ab40-4465-8483-c3854ac93da0","requestId":null,"userSubscriptionId":null,"startTime":"2021-02-24T11:31:37.7190095Z","userAgent":"BuildSvc/1.0.20210206.1","requestHttpMethod":null,"requestUri":null,"apiVersion":null,"version":"1.0.20210206.1","requestHeaders":{},"loggingProperties":{"applicationName":"BuildService","deviceOperatingSystem":"Linux 5.4.0-1039-azure #41~18.04.1-Ubuntu SMP Mon Jan 18 14:00:01 UTC 2021","framework":".NET Core 4.6.28325.01"}}</json>
2021-02-24T11:31:38.5240909Z | BuildService | TRACE | Download workspace
2021-02-24T11:31:38.5578023Z | BuildService | TRACE | Log Handler started
2021-02-24T11:31:38.7225955Z | BuildService | WARNG | DownloadWorkspaceAsync failed with {"Message":"The SSL connection could not be established, see inner exception.","Data":{},"InnerException":{"ClassName":"System.Security.Authentication.AuthenticationException","Message":"The remote certificate is invalid according to the validation procedure.","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":"   at System.Net.Security.SslState.StartSendAuthResetSignal(ProtocolToken message, AsyncProtocolRequest asyncRequest, ExceptionDispatchInfo exception)\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\n   at System.Net.Security.SslState.PartialFrameCallback(AsyncProtocolRequest asyncRequest)\n--- End of stack trace from previous location where exception was thrown ---\n   at System.Net.Security.SslState.ThrowIfExceptional()\n   at System.Net.Security.SslState.InternalEndProcessAuthentication(LazyAsyncResult lazyResult)\n   at System.Net.Security.SslState.EndProcessAuthentication(IAsyncResult result)\n   at System.Net.Security.SslStream.EndAuthenticateAsClient(IAsyncResult asyncResult)\n   at System.Net.Security.SslStream.<>c.<AuthenticateAsClientAsync>b__47_1(IAsyncResult iar)\n   at System.Threading.Tasks.TaskFactory`1.FromAsyncCoreLogic(IAsyncResult iar, Func`2 endFunction, Action`1 endAction, Task`1 promise, Boolean requiresSynchronization)\n--- End of stack trace from previous location where exception was thrown ---\n   at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken cancellationToken)","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233087,"Source":"System.Private.CoreLib","WatsonBuckets":null},"StackTrace":"   at Microsoft.Rest.RetryDelegatingHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)\n   at 
Microsoft.Azure.DevSpaces.Common.Auth.Handlers.AuthErrorHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in /app/common.auth/Handlers/AuthErrorHttpHandler.cs:line 47\n   at Microsoft.Azure.DevSpaces.Common.Auth.Handlers.ServiceClientCredentialsHttpHandler`1.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in /app/common.auth/Handlers/ServiceClientCredentialsHttpHandler.cs:line 34\n   at System.Net.Http.HttpClient.FinishSendAsyncUnbuffered(Task`1 sendTask, HttpRequestMessage request, CancellationTokenSource cts, Boolean disposeCts)\n   at Microsoft.Azure.DevSpaces.ExecService.Client.WorkspaceOperations.DownloadSourceWithHttpMessagesAsync(String spaceName, String serviceName, String name, Dictionary`2 customHeaders, CancellationToken cancellationToken) in /app/execsvc.client.autogen/WorkspaceOperations.cs:line 161\n   at Microsoft.Azure.DevSpaces.ExecService.Client.WorkspaceOperationsExtensions.DownloadSourceAsync(IWorkspaceOperations operations, String spaceName, String serviceName, String name, CancellationToken cancellationToken) in /app/execsvc.client.autogen/WorkspaceOperationsExtensions.cs:line 49\n   at Microsoft.Azure.DevSpaces.Build.BuildClient.DownloadWorkspaceAsync() in /app/build/BuildClient.cs:line 154","HelpLink":null,"Source":"Microsoft.Rest.ClientRuntime","HResult":-2146233087}, retry.

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman Restarted the components again -- is it working now? Also I'll make sure we bump up the priority on getting this sorted -- no idea why this issue is affecting West Europe so frequently 😕

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

@Eneuman We now have beta bits to try for pod identity/managed identity. As discussed previously, would you be willing to give them a try?
(Tagging @pragyamehta who worked on this on our side)

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal It seems like we have a problem again. Pods are giving us these errors:

pods/klinikportalservice-858c9b7c7-lhvlq: BackOff: Back-off restarting failed container

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman Are there any logs or events on the pod? We did do a release recently, but it shouldn't have changed any functionality of the controller ☹
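To check for logs or events on the pod, something like the following should work (a sketch; the pod name is taken from your message and the namespace is assumed to be 'dev'):

kubectl -n dev describe pod klinikportalservice-858c9b7c7-lhvlq      # events are listed at the bottom of the output
kubectl -n dev logs klinikportalservice-858c9b7c7-lhvlq --previous   # logs from the last failed container (add -c <container> if prompted)
kubectl -n dev get events --sort-by=.lastTimestamp                   # recent events in the space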

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal
pods/klinikportalservice-75588487b4-vtwp4: BackOff: Back-off restarting failed container
Oops... An unexpected error has occurred. A report of the error will be sent to Microsoft.
For diagnostic information, see Azure Dev Spaces logs at 'C:\Users\PerBornsjö\AppData\Local\Temp\Azure Dev Spaces'.
Please include the following Client Request ID when contacting support: ab587787-70e2-4d20-bbe6-d69adbfd786c

2021-03-12T11:22:21.5408289Z | CLI | WARNG | Logging handled exception: Microsoft.Azure.DevSpaces.Client.Exceptions.OperationIdException: {"RequestId":null,"ClientRequestId":"ab587787-70e2-4d20-bbe6-d69adbfd786c","CorrelationRequestId":null,"Request":null,"Response":null,"Format":"The SSL connection could not be established, see inner exception.","Args":[],"Message":"The SSL connection could not be established, see inner exception.","Data":{},"InnerException":{"Message":"The SSL connection could not be established, see inner exception.","Data":{},"InnerException":{"ClassName":"System.Security.Authentication.AuthenticationException","Message":"The remote certificate is invalid according to the validation procedure.","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":" at System.Net.Security.SslState.StartSendAuthResetSignal(ProtocolToken message, AsyncProtocolRequest asyncRequest, ExceptionDispatchInfo exception)\r\n at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)\r\n at System.Net.Security.SslState.PartialFrameCallback(AsyncProtocolRequest asyncRequest)\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Net.Security.SslState.InternalEndProcessAuthentication(LazyAsyncResult lazyResult)\r\n at System.Net.Security.SslState.EndProcessAuthentication(IAsyncResult result)\r\n at System.Net.Security.SslStream.EndAuthenticateAsClient(IAsyncResult asyncResult)\r\n at System.Net.Security.SslStream.<>c.b__47_1(IAsyncResult iar)\r\n at System.Threading.Tasks.TaskFactory1.FromAsyncCoreLogic(IAsyncResult iar, Func2 endFunction, Action1 endAction, Task1 promise, Boolean requiresSynchronization)\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken 
cancellationToken)","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233087,"Source":"System.Private.CoreLib","WatsonBuckets":null},"StackTrace":" at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken cancellationToken)\r\n at System.Threading.Tasks.ValueTask1.get_Result()\r\n at System.Net.Http.HttpConnectionPool.CreateConnectionAsync(HttpRequestMessage request, CancellationToken cancellationToken)\r\n at System.Threading.Tasks.ValueTask1.get_Result()\r\n at System.Net.Http.HttpConnectionPool.WaitForCreatedConnectionAsync(ValueTask1 creationTask)\r\n at System.Net.Http.HttpConnectionPool.SendWithRetryAsync(HttpRequestMessage request, Boolean doRequestAuth, CancellationToken cancellationToken)\r\n at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)\r\n at Microsoft.Azure.DevSpaces.Common.Logging.OperationContextRequestIdsHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\common\\Logging\\OperationContextRequestIdsHttpHandler.cs:line 45\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.Handlers.ClientOperationContextRequestIdsHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\Handlers\\ClientOperationContextRequestIdsHttpHandler.cs:line 48\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.Handlers.UnsupportedRegionErrorHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\Handlers\\UnsupportedRegionErrorHttpHandler.cs:line 45\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.Handlers.ApiVersionCheckerHttpHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\Handlers\\ApiVersionCheckerHttpHandler.cs:line 62\r\n at Microsoft.AspNetCore.Http.Connections.Client.Internal.AccessTokenHttpMessageHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)\r\n at Microsoft.AspNetCore.Http.Connections.Client.Internal.LoggingHttpMessageHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)\r\n at System.Net.Http.HttpClient.FinishSendAsyncUnbuffered(Task1 sendTask, HttpRequestMessage request, CancellationTokenSource cts, Boolean disposeCts)\r\n at Microsoft.AspNetCore.Http.Connections.Client.Internal.SendUtils.SendMessages(Uri sendUrl, IDuplexPipe application, HttpClient httpClient, ILogger logger, CancellationToken cancellationToken)\r\n at System.IO.Pipelines.PipeCompletion.ThrowLatchedException()\r\n at System.IO.Pipelines.Pipe.GetReadResult(ReadResult& result)\r\n at System.IO.Pipelines.Pipe.GetReadAsyncResult()\r\n at System.Threading.Tasks.ValueTask1.get_Result()\r\n at Microsoft.AspNetCore.SignalR.Client.HubConnection.ReceiveLoop(ConnectionState connectionState)\r\n at System.Threading.Channels.AsyncOperation1.GetResult(Int16 token)\r\n at System.Threading.Tasks.ValueTask1.get_Result()\r\n at Microsoft.AspNetCore.SignalR.Client.HubConnectionExtensions.<>c__DisplayClass57_01.<g__RunChannel|0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Threading.Channels.AsyncOperation1.GetResult(Int16 token)\r\n at 
Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecSignalRServerProxy._RunAsync[T](Func1 func, CancellationToken cancellationToken, String methodName) in C:\A\1\55\s\src\client\Client\ServiceClients\ExecSignalRServerProxy.cs:line 104\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecSignalRServiceClient.<>c__DisplayClass29_01.<<AutoRunChannelStreamUntilCanceledAsync>b__3>d.MoveNext() in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\ExecSignalRServiceClient.cs:line 402\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecSignalRServiceClient.RunAsync(String operationDescription, Func1 func, CancellationToken cancellationToken, Nullable1 maxWaitForSuccess, Boolean tolerateNotFoundResponses) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\ExecSignalRServiceClient.cs:line 227\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecSignalRServiceClient.AutoRunChannelStreamUntilCanceledAsync[T](String operationName, Func1 startStreamCallback, Action1 itemHandler, IProgressReporter progressReporter, TimeSpan maxWaitForConnectionSuccess, CancellationToken cancellationToken, Boolean useCatchUp, Boolean tolerateNotFoundResponses, Func2 streamCompletedCallback, Action firstConnectionEstablishedCallback) in C:\A\1\55\s\src\client\Client\ServiceClients\ExecSignalRServiceClient.cs:line 407\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecExceptionStrategy.RunWithHandlingAsync[T](Func1 func, FailureConfig failureConfig) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\ExceptionStrategies\\ExecExceptionStrategy.cs:line 59\r\n at Microsoft.Azure.DevSpaces.Client.ManagementClients.ServiceManagementClientImplementation.GetBuildLogsAsync(String chartName, String requestId, IProgressReporter progressReporter, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\client\\Client\\ManagementClients\\ServiceManagementClientImplementation.cs:line 748\r\n at Microsoft.Azure.DevSpaces.Client.ManagementClients.ServiceManagementClientImplementation.DeployReleaseInnerAsync(IServiceConfig serviceConfig, DeployReleaseOption deployReleaseOption, String cwd, IProgressReporter progressReporter, IPerformanceLogger perfLogger, CancellationToken cancellationToken) in C:\\A\\1\\55\\s\\src\\client\\Client\\ManagementClients\\ServiceManagementClientImplementation.cs:line 545\r\n at Microsoft.Azure.DevSpaces.Client.ManagementClients.ServiceManagementClientImplementation.DeployAzdsReleaseAsync(IServiceConfig serviceConfig, DeployReleaseOption deployReleaseOption, IProgress1 progress, CancellationToken cancellationToken, Boolean skipSyncCode) in C:\A\1\55\s\src\client\Client\ManagementClients\ServiceManagementClientImplementation.cs:line 224\r\n at Microsoft.Azure.DevSpaces.Client.Extensions.OwnedExtensions.<>c__DisplayClass2_02.<<TryRunOwnedOperationThenDisposeAsync>b__0>d.MoveNext() in C:\\A\\1\\55\\s\\src\\client\\Extensions\\OwnedExtensions.cs:line 62\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at Microsoft.Azure.DevSpaces.Client.Utilities.AutofacUtilities.TryRunWithErrorPropagationAsync[T](Func1 func, ILog log, IOperationContext operationContext) in C:\A\1\55\s\src\client\Utilities\AutofacUtilities.cs:line 81","HelpLink":null,"Source":"System.Net.Http","HResult":-2146233087},"StackTrace":" at Microsoft.Azure.DevSpaces.Client.Utilities.AutofacUtilities._Handle[T](Exception e, ILog log, IOperationContext operationContext, Boolean isRecurse) in 
C:\A\1\55\s\src\client\Utilities\AutofacUtilities.cs:line 163\r\n at Microsoft.Azure.DevSpaces.Client.Utilities.AutofacUtilities.TryRunWithErrorPropagationAsync[T](Func`1 func, ILog log, IOperationContext operationContext) in C:\A\1\55\s\src\client\Utilities\AutofacUtilities.cs:line 87\r\n at Microsoft.Azure.DevSpaces.Cli.Commands.Service.ServiceUpCommand.ExecuteAsync() in C:\A\1\55\s\src\cli\Commands\Service\ServiceUpCommand.cs:line 233\r\n at Microsoft.Azure.DevSpaces.Cli.DevSpacesCliApp.RunCommandAsync(String[] args, CancellationToken cancellationToken) in C:\A\1\55\s\src\cli\DevSpacesCliApp.cs:line 190\r\n at Microsoft.Azure.DevSpaces.Cli.DevSpacesCliApp.ExecuteAsync(String[] args, CancellationToken cancellationToken) in C:\A\1\55\s\src\cli\DevSpacesCliApp.cs:line 149","HelpLink":null,"Source":null,"HResult":-2146233088}
2021-03-12T11:22:21.5443083Z | CLI | TRACE | Event: Command.End {"properties":{"arguments":"up","result":"Failed"},"metrics":{"duration":468956.0}}

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman OK, thanks. I restarted our components; could you please see if it works now? We made the change to restart them periodically without manual intervention, but perhaps 24 hours is too big an interval.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Seems like the problem is back, but now I also received this:

pods/klinikportalservice-844fb67df4-2ct56: BackOff: Back-off restarting failed container
Cancelling...
Timed out after 14m trying to start build logs streaming operation.

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman our components restarted overnight, can you try the operation now?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Some services can be installed with azds up, but about 50% give us timeouts or other strange errors :(

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman that's very strange :( Can you send me your log files?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal
Here is a new error for us:
Azds up for failed: Helm install failed with exit code '1': Error: UPGRADE FAILED: query: failed to query with labels: Get https://aks-we-eclinic-dev-dns-a2826232.hcp.westeurope.azmk8s.io:443/api/v1/namespaces/dev/secrets?labelSelector=name%3Dazds-b75d66-dev-organisationpersonalservice%2Cowner%3Dhelm%2Cstatus%3Ddeployed: dial tcp 51.138.35.185:443: i/o timeoutFailed to install Helm chart! 52s

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

And we are also receiving this:
Timed out after 13s trying to start deploy release streaming operation.
and also
Timed out after 2m trying to list azds pods.

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman that's very strange... I'll take a look at the logs on my side. This might be a naive question, but is "helm install" for your chart working outside of an AZDS context?
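For reference, a minimal way to test that outside of an AZDS context could look like the lines below (release name, chart path, and namespace are placeholders, not taken from your setup):

# Render/validate the chart first, then attempt a real install/upgrade.
helm upgrade --install <release-name> ./charts/<service> --namespace <namespace> --dry-run --debug
helm upgrade --install <release-name> ./charts/<service> --namespace <namespace>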

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal I have not used them outside of azds up so I can't say. But azds up has not been a problem before and nothing has changed in the service.

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman I restarted the components again and looking at our backend logs in parallel, I see a lot of helm timeout exceptions as well. Not sure what is causing it. Let me know if restarting changed anything

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal After a few retries they all got deployed :)
But it seems like there are some performance issues. The deployments take a lot of time.

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman I'm glad you got unblocked! That gives us some more time to investigate hopefully

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Hi again. About 10 minutes ago it stopped working again. This time we received this error message: This version of Kubernetes is no longer supported. Please upgrade your cluster to a supported Kubernetes version and retry.
After waiting for 5 min, the error went away. It's working now.

I'm beginning to see a pattern with these outages.
Our main project has 3700 files in it. Suddenly it decided that it wants to synchronize all those files at once.
This works okay the first time, but after that, synchronization seems to start having performance issues.
I don't know how dev spaces handles synchronization on your end, but if it tries to handle each file in a separate thread (or too many at once), this could maybe be a case of thread starvation that leads to deadlocks. Just a thought.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Hi.
It seems like the services are stuck again. Can you please restart them?

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

Adding @rakeshvanga who is looking at issues this week, as @amsoedal isn't available.

from dev-spaces.

rakeshvanga avatar rakeshvanga commented on July 24, 2024

@Eneuman I've restarted our components. Can you try now?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@rakeshvanga Hi
It's working again :)
Thanks!

from dev-spaces.

rakeshvanga avatar rakeshvanga commented on July 24, 2024

Sure, I will update once I'm done.

from dev-spaces.

rakeshvanga avatar rakeshvanga commented on July 24, 2024

@Eneuman, I've restarted the pods. Can you try now?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@rakeshvanga
Hi
It just crashed again, can you restart it please? :)

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

I have attached today's crash logs if you want to take a look and see if you can figure out what's going on :)
2021-04-01T08-07-10.4498504.log
2021-04-01T07-20-03.4750117.log
2021-04-01T15-59-25.5680363.log

One thing I have noticed is that if I make a change to a .scss file and have webcompiler compile it, then dev spaces wants to synchronize all files in my project (all 3617 of them), and after that bad things start to happen.

Updating 3617 files in container...
2m 0s
Update completed in 5m 54s.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@daniv-msft It looks like the services are stuck again. Can you please ask someone to restart them? :)

Here are the logs:
2021-04-08T07:11:36.9106299Z | CLI | ERROR | Deploy release request failed with: Couldn't deploy chart klinikportalservice.\n
2021-04-08T07:11:38.4275086Z | CLI | ERROR | Dependency: Service Run - Port Forward {"target":null,"success":false,"duration":null,"properties":{"requestId":"9af523ba-b247-4861-8601-e6fdf63ef2e4","clientRequestId":"7f0e9d99-f7c8-404e-8324-82dda1fd9385","correlationRequestId":"null"}}
2021-04-08T07:11:38.4322413Z | CLI | ERROR | Oops... An unexpected error has occurred. A report of the error will be sent to Microsoft.\nFor diagnostic information, see Azure Dev Spaces logs at 'C:\Users\PerBornsjö\AppData\Local\Temp\Azure Dev Spaces'.\n
2021-04-08T07:11:38.4331263Z | CLI | ERROR | Please include the following Request ID when contacting support: 9af523ba-b247-4861-8601-e6fdf63ef2e4\n
2021-04-08T07:11:38.4601262Z | CLI | WARNG | Logging handled exception: Microsoft.Azure.DevSpaces.Client.Exceptions.OperationIdException: {"RequestId":"9af523ba-b247-4861-8601-e6fdf63ef2e4","ClientRequestId":"7f0e9d99-f7c8-404e-8324-82dda1fd9385","CorrelationRequestId":null,"Request":null,"Response":null,"Format":"Couldn't deploy chart klinikportalservice.","Args":[],"Message":"Couldn't deploy chart klinikportalservice.","Data":{},"InnerException":{"ClassName":"Microsoft.Azure.DevSpaces.Common.Exceptions.ExecInternalServerErrorException","Message":"Couldn't deploy chart klinikportalservice.","Data":null,"InnerException":{"ClassName":"Microsoft.AspNetCore.SignalR.HubException","Message":"An error occurred on the server while streaming results. ExecInternalServerErrorException: Couldn't deploy chart klinikportalservice.","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":" at System.Threading.Channels.AsyncOperation1.GetResult(Int16 token)\r\n at System.Threading.Tasks.ValueTask1.get_Result()\r\n at Microsoft.AspNetCore.SignalR.Client.HubConnectionExtensions.<>c__DisplayClass57_01.<<StreamAsChannelCoreAsync>g__RunChannel|0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Threading.Channels.AsyncOperation1.GetResult(Int16 token)\r\n at System.Threading.Tasks.ValueTask1.get_Result()\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecSignalRServerProxy._RunAsync[T](Func1 func, CancellationToken cancellationToken, String methodName) in C:\A\1\55\s\src\client\Client\ServiceClients\ExecSignalRServerProxy.cs:line 104","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233088,"Source":"System.Private.CoreLib","WatsonBuckets":null},"HelpURL":null,"StackTraceString":" at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecSignalRServerProxy._RunAsync[T](Func1 func, CancellationToken cancellationToken, String methodName) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\ExecSignalRServerProxy.cs:line 113\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecSignalRServiceClient.<>c__DisplayClass29_01.<b__3>d.MoveNext() in C:\A\1\55\s\src\client\Client\ServiceClients\ExecSignalRServiceClient.cs:line 402\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecSignalRServiceClient.RunAsync(String operationDescription, Func1 func, CancellationToken cancellationToken, Nullable1 maxWaitForSuccess, Boolean tolerateNotFoundResponses) in C:\A\1\55\s\src\client\Client\ServiceClients\ExecSignalRServiceClient.cs:line 227\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecSignalRServiceClient.AutoRunChannelStreamUntilCanceledAsync[T](String operationName, Func1 startStreamCallback, Action1 itemHandler, IProgressReporter progressReporter, TimeSpan maxWaitForConnectionSuccess, CancellationToken cancellationToken, Boolean useCatchUp, Boolean tolerateNotFoundResponses, Func2 streamCompletedCallback, Action firstConnectionEstablishedCallback) in C:\\A\\1\\55\\s\\src\\client\\Client\\ServiceClients\\ExecSignalRServiceClient.cs:line 407\r\n at Microsoft.Azure.DevSpaces.Client.ServiceClients.ExecExceptionStrategy.RunWithHandlingAsync[T](Func1 func, FailureConfig failureConfig) in C:\A\1\55\s\src\client\Client\ServiceClients\ExceptionStrategies\ExecExceptionStrategy.cs:line 59\r\n at 
Microsoft.Azure.DevSpaces.Client.ManagementClients.ServiceManagementClientImplementation.DeployReleaseInnerAsync(IServiceConfig serviceConfig, DeployReleaseOption deployReleaseOption, String cwd, IProgressReporter progressReporter, IPerformanceLogger perfLogger, CancellationToken cancellationToken) in C:\A\1\55\s\src\client\Client\ManagementClients\ServiceManagementClientImplementation.cs:line 510\r\n at Microsoft.Azure.DevSpaces.Client.ManagementClients.ServiceManagementClientImplementation.DeployAzdsReleaseAsync(IServiceConfig serviceConfig, DeployReleaseOption deployReleaseOption, IProgress1 progress, CancellationToken cancellationToken, Boolean skipSyncCode) in C:\\A\\1\\55\\s\\src\\client\\Client\\ManagementClients\\ServiceManagementClientImplementation.cs:line 224\r\n at Microsoft.Azure.DevSpaces.Client.Extensions.OwnedExtensions.<>c__DisplayClass2_02.<b__0>d.MoveNext() in C:\A\1\55\s\src\client\Extensions\OwnedExtensions.cs:line 62\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at Microsoft.Azure.DevSpaces.Client.Utilities.AutofacUtilities.TryRunWithErrorPropagationAsync[T](Func1 func, ILog log, IOperationContext operationContext) in C:\\A\\1\\55\\s\\src\\client\\Utilities\\AutofacUtilities.cs:line 81","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233088,"Source":"Microsoft.Azure.DevSpaces.Client","WatsonBuckets":null},"StackTrace":" at Microsoft.Azure.DevSpaces.Client.Utilities.AutofacUtilities._Handle[T](Exception e, ILog log, IOperationContext operationContext, Boolean isRecurse) in C:\\A\\1\\55\\s\\src\\client\\Utilities\\AutofacUtilities.cs:line 163\r\n at Microsoft.Azure.DevSpaces.Client.Utilities.AutofacUtilities.TryRunWithErrorPropagationAsync[T](Func1 func, ILog log, IOperationContext operationContext) in C:\A\1\55\s\src\client\Utilities\AutofacUtilities.cs:line 87\r\n at Microsoft.Azure.DevSpaces.Cli.Commands.Service.ServiceUpCommand.ExecuteAsync() in C:\A\1\55\s\src\cli\Commands\Service\ServiceUpCommand.cs:line 233\r\n at Microsoft.Azure.DevSpaces.Cli.DevSpacesCliApp.RunCommandAsync(String[] args, CancellationToken cancellationToken) in C:\A\1\55\s\src\cli\DevSpacesCliApp.cs:line 190\r\n at Microsoft.Azure.DevSpaces.Cli.DevSpacesCliApp.ExecuteAsync(String[] args, CancellationToken cancellationToken) in C:\A\1\55\s\src\cli\DevSpacesCliApp.cs:line 149","HelpLink":null,"Source":null,"HResult":-2146233088}
2021-04-08T07:11:38.4628803Z | CLI | TRACE | Event: Command.End {"properties":{"arguments":"up","result":"Failed"},"metrics":{"duration":109514.0}}

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

Hi @Eneuman, I restarted the usual component, can you see if it's working now?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Hi
It's still stuck at: Waiting for container image build.... It never starts and times out after 10 min :(

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman hm.... might be something else to do with connectivity to the controller. :( I restarted a different component this time, could you try it now?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Still not working :(

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

It seems like we're not able to build the service for some reason, or the service pod isn't coming up correctly. I see on the backend lots of logs like "still no user pods found in Ready state". Could you try doing an "up", and in a separate window doing kubectl get po -n <your space> --watch? And if you see any pods crashing, do you see any events when you do kubectl describe on the pod?
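Roughly something like this, assuming your space is named 'dev' (substitute your actual namespace and pod name):

kubectl get pods -n dev --watch
# Then, for any pod that crashes or keeps restarting:
kubectl describe pod <pod-name> -n dev
# The Events section at the end of the describe output usually shows why the pod failed.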

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

It seems like the pods crash with the error ErrImageNeverPull.

[screenshot: pod list showing ErrImageNeverPull]

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman thanks. What do you see when you run "kubectl describe <name of crashing pod>" in a different window under the "Events" section?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Normal Scheduled 9m24s default-scheduler Successfully assigned dev/organisationpersonalservice-67644fc776-hhjnw to aks-agentpool3-99437877-vmss000000
Warning ErrImageNeverPull 7m13s (x13 over 9m23s) kubelet Container image "organisationpersonalservice:devspaces-x0178b13ba1de9152" is not present with pull policy of Never
Warning Failed 4m18s (x27 over 9m23s) kubelet Error: ErrImageNeverPull

I also keep getting this error:
Waiting for container image build...Oops... An unexpected error has occurred. A report of the error will be sent to Microsoft.
For diagnostic information, see Azure Dev Spaces logs at 'C:\Users\PerBornsjö\AppData\Local\Temp\Azure Dev Spaces'.
Please include the following Request ID when contacting support: 597cfa4b-4912-4713-9f8b-1916c3fd0cb5

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

Ok thanks for adding these details & thanks for your patience. I'm still not totally sure what's going on, but one possibility is that we're failing to build the docker image for your service for some reason so it's not present locally. Could you please zip your %Temp%\Azure Dev Spaces files and send them to me at [email protected]? I'll let you know as soon as I have an update to share.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Thanks. You should have them in your inbox now.

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman thanks so much. One last question, have any changes been made to your Dockerfile.develop recently?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Hi. No changes have been made to the Dockerfile.develop for about 2 months.
The service was working fine about 1 week ago.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Ok thanks for adding these details & thanks for your patience. I'm still not totally sure what's going on, but one possibility is that we're failing to build the docker image for your service for some reason so it's not present locally. Could you please zip your %Temp%\Azure Dev Spaces files and send them to me at [email protected]? I'll let you know as soon as I have an update to share.

Normally I can see in the console window that it starts to build my docker image, but now I don't get any indication that it has actually started to build it.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Anything else you want me to try? We are kinda stuck atm :(

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman sorry no updates yet, but one of my US colleagues is now helping to look at the issue. Working to get to the bottom of this

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman our best guess right now is that something on your cluster is preventing the container image from getting built. I also see that in the screenshot mentioned here:

It seems like the pods crash with the error ErrImageNeverPull.

[screenshot: pod list showing ErrImageNeverPull]

There are some pods that have been terminating for 25m. Are they from a previous "azds up", or were they deployed another way? Are there many pods that are stuck in terminating state on the cluster?
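A quick way to check for stuck pods, as a rough sketch:

# Lists any pods stuck in Terminating anywhere on the cluster.
kubectl get pods --all-namespaces | grep Terminating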

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Just the pods that I have been trying to do an azds up for.

I tried to create a new service and start it but it also gets the same errors:
testapplication-6dcd88956b-fpsrz 0/1 Pending 0 0s
testapplication-6dcd88956b-fpsrz 0/1 Pending 0 0s
testapplication-6dcd88956b-fpsrz 0/1 ContainerCreating 0 0s
testapplication-6dcd88956b-fpsrz 0/1 ErrImageNeverPull 0 1s

What image is it trying to pull?

What happens between synchronization of my files and the build?

I don't see any init containers getting started. Can this be the problem?

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

@Eneuman Thanks for your reply. The way it works is that we inject an init container into the user pod on your cluster, which builds the docker image needed based on your code. This is why the image pull policy of the user pod is set to "Never": we never want to pull the image remotely; it should always be available locally. However, the reason why this image isn't being built is unclear.

You mentioned that no init containers are started. This could indeed be the problem. Can you see one called (or using the docker image) "devspaces-build" or "azds-build-service"?
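One way to confirm whether the init container was injected at all is a check along these lines (rough sketch; the pod name is taken from your earlier describe output, so substitute whichever pod you are debugging):

# Empty output means the webhook never injected a build init container into the pod spec.
kubectl get pod organisationpersonalservice-67644fc776-hhjnw -n dev -o jsonpath='{.spec.initContainers[*].name}'
kubectl get pod organisationpersonalservice-67644fc776-hhjnw -n dev -o jsonpath='{.spec.initContainers[*].image}'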

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@daniv-msft
This is what the running pods look like:
[screenshot: running pods]

This is what the pods we tried to build today look like:
[screenshot: pods failing with ErrImageNeverPull]

I think the problem is that the init containers do not start.
Webhook problem?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

kubectl logs azds-webhook-deployment-6f5f6f596-z9h2l --namespace azds shows this:

azdslogger: main.go:133: Error flushing logs: adal: Refresh request failed. Status Code = '401'. Response body: {"error":"invalid_client","error_description":"AADSTS7000222: The provided client secret keys are expired. Visit the Azure Portal to create new keys for your app, or consider using certificate credentials for added security: https://docs.microsoft.com/azure/active-directory/develop/active-directory-certificate-credentials\r\nTrace ID: 3688cb74-cddf-416a-a2b5-283177cb2200\r\nCorrelation ID: 765fc0ec-9bb8-44c1-9c32-c3cf80a90b50\r\nTimestamp: 2021-04-08 16:11:37Z","error_codes":[7000222],"timestamp":"2021-04-08 16:11:37Z","trace_id":"3688cb74-cddf-416a-a2b5-283177cb2200","correlation_id":"765fc0ec-9bb8-44c1-9c32-c3cf80a90b50","error_uri":"https://login.microsoftonline.com/error?code=7000222"}

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

The webhook pods' logs are full of errors :(

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

Thanks @Eneuman for sharing these, it's helpful. We're looking into them and getting back to you once we understand better what could explain this.

from dev-spaces.

daniv-msft avatar daniv-msft commented on July 24, 2024

Also, could you please let us know if you can get any logs from the devspaces-build container?
Finally, could you please clarify what made you look into azds-webhook-deployment? I'm curious to know if you saw a log somewhere that would point to that.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

I can't find a devspaces-build container, but isn't that also one that the webhook admission server creates?

What made me look there was the fact that no build gets started and Kubernetes tries to use my "raw" charts to pull in an image that does not exist.

This points to a problem with the webhook not injecting init or build containers.

Do you have access to the logs or should I mail them to you?
It's clear that for some reason the token is expired and is not getting renewed.
The log says the token was created 2021-04-01T17:59:04Z. At 2021-04-08T07:02:46Z, we started to get 401s.

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman have the certs for the cluster been rotated recently? Could you please try running azds controller refresh-credentials?

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal We haven't rotated them manually. I'm not sure if AKS does this by itself.

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

@amsoedal Didn't work, still getting these errors:
azdslogger: main.go:133: Error flushing logs: adal: Refresh request failed. Status Code = '401'. Response body: {"error":"invalid_client","error_description":"AADSTS7000222: The provided client secret keys are expired. Visit the Azure Portal to create new keys for your app, or consider using certificate credentials for added security: https://docs.microsoft.com/azure/active-directory/develop/active-directory-certificate-credentials\r\nTrace ID: 55e05563-e9ea-4000-a43d-8558017e2600\r\nCorrelation ID: 8c63e8ad-e03d-424a-aec3-d8ffa582a80d\r\nTimestamp: 2021-04-08 16:53:41Z","error_codes":[7000222],"timestamp":"2021-04-08 16:53:41Z","trace_id":"55e05563-e9ea-4000-a43d-8558017e2600","correlation_id":"8c63e8ad-e03d-424a-aec3-d8ffa582a80d","error_uri":"https://login.microsoftonline.com/error?code=7000222"}

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

I guess I can recreate the webhook pods and that would probably do it, but I don't want to do that until you have all the debug information you need :)

from dev-spaces.

amsoedal avatar amsoedal commented on July 24, 2024

@Eneuman thank you :) I think we have enough info, please do try recreating the webhook pods. If that doesn't do anything, recreating the controller is probably the next step
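As a rough sketch, using the deployment and namespace names visible in your earlier webhook log output (treat these names as assumptions and adjust as needed):

# Find the webhook pods, then restart them by rolling the deployment (kubectl 1.15+),
# or delete the pods directly and let the deployment recreate them.
kubectl get pods -n azds
kubectl rollout restart deployment azds-webhook-deployment -n azds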

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

I deleted the pods, but after they were recreated they had this in the log:

azdslogger: main.go:459: Getting standard service principal token

azdslogger: main.go:478: adal: Refresh request failed. Status Code = '401'. Response body: {"error":"invalid_client","error_description":"AADSTS7000222: The provided client secret keys are expired. Visit the Azure Portal to create new keys for your app, or consider using certificate credentials for added security: https://docs.microsoft.com/azure/active-directory/develop/active-directory-certificate-credentials\r\nTrace ID: 34da58a1-29da-4208-996b-43fb6f122200\r\nCorrelation ID: b0174168-a54b-4890-8452-92430b625434\r\nTimestamp: 2021-04-08 17:00:07Z","error_codes":[7000222],"timestamp":"2021-04-08 17:00:07Z","trace_id":"34da58a1-29da-4208-996b-43fb6f122200","correlation_id":"b0174168-a54b-4890-8452-92430b625434","error_uri":"https://login.microsoftonline.com/error?code=7000222"}

So I guess this is not going to work

from dev-spaces.

Eneuman avatar Eneuman commented on July 24, 2024

Can this have something to do with us running 2FA on our AD domain?

from dev-spaces.
