
azure-functions-java-library's Introduction

Azure Functions Logo

Branch Status
dev Build Status
v2.x Build Status

Library for Azure Java Functions

This repo contains the library for building Azure Functions in Java. Visit the complete documentation, Azure Functions - Java Developer Guide, for more details.

The dev branch will be used to make any changes necessary to support v4 extension bundle.

The v2.x branch will be used to make any changes necessary to support v3 extension bundle.

azure-functions-maven plugin

How to use the azure-functions-maven plugin to create, update, deploy, and test Azure Java functions

Prerequisites

  • Java 8

Parent POM

For details on the parent POM, see https://github.com/Microsoft/maven-java-parent

Summary

Azure Functions is a solution for easily running small pieces of code, or "functions," in the cloud. You can write just the code you need for the problem at hand, without worrying about a whole application or the infrastructure to run it. Functions can make development even more productive. Pay only for the time your code runs and trust Azure to scale as needed. Azure Functions lets you develop serverless applications on Microsoft Azure.

Azure Functions supports triggers, which are ways to start execution of your code, and bindings, which are ways to simplify coding for input and output data. A function should be a stateless method that processes input and produces output. Although you are allowed to write instance methods, your function must not depend on any instance fields of the class. You need to make sure all the function methods are publicly accessible and that each method annotated with @FunctionName has a unique name, as that name defines the entry point for the function.

A deployable unit is an uber JAR containing one or more functions (see below) and a JSON file with the function and trigger definitions, deployed to Azure Functions. The JAR can be created in many ways, although we recommend the Azure Functions Maven Plugin, as it provides templates to get you started with key scenarios.

All the input and output bindings can be defined in function.json (not recommended), or in the Java method by using annotations (recommended). All the types and annotations used in this document are included in the azure-functions-java-library package.

Sample

Here is an example of an HttpTrigger Azure function in Java:

package com.example;

import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("echo")
    public static String echo(@HttpTrigger(name = "req", methods = { HttpMethod.POST }, authLevel = AuthorizationLevel.ANONYMOUS) String in) {
        return "Hello, " + in + ".";
    }
}

Adding 3rd Party Libraries

Azure Functions supports the use of 3rd party libraries. If using the Maven plugin for Azure Functions, all of your dependencies specified in your pom.xml file will be automatically bundled during the mvn package step.

Data Types

You are free to use all the data types in Java for the input and output data, including native types, customized POJO types, and the specialized Azure types defined in this API. The Azure Functions runtime will try its best to convert the actual input value to the type you need (for example, a String input will be treated as a JSON string and be parsed to a POJO type defined in your code).

JSON Support

The POJO types (Java classes) you may define have to be publicly accessible (public modifier). POJO properties/fields may be private. For example, the JSON string { "x": 3 } can be converted to the following POJO type:

public class PojoData {
    private int x;
}
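
For illustration, here is a minimal sketch of how that conversion plays out with an HTTP trigger; the function name pojoEcho and the getX() accessor are illustrative additions, not part of the library:

package com.example;

import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("pojoEcho")
    public String pojoEcho(
        // The runtime treats the request body as a JSON string and parses it into PojoData.
        @HttpTrigger(name = "req", methods = { HttpMethod.POST }, authLevel = AuthorizationLevel.ANONYMOUS) PojoData data
    ) {
        return "x is " + data.getX();
    }
}

public class PojoData {
    private int x;
    public int getX() { return this.x; }
}

Posting { "x": 3 } to this function should return "x is 3".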

Other supported types

Binary data is represented as byte[] or Byte[] in your Azure Functions code. Make sure you specify dataType = "binary" in the corresponding triggers/bindings.

Empty input values could arrive as null in your function arguments, but the recommended way to deal with potentially empty values is to use the Optional<T> type.
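
As a minimal sketch of both cases (the function names blobSize and maybeEcho, the blob path, and the connection setting are placeholders):

package com.example;

import java.util.Optional;
import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;

public class Function {
    // Binary input: the blob content arrives as byte[] because dataType = "binary" is set.
    @FunctionName("blobSize")
    public void blobSize(
        @BlobTrigger(name = "blob", path = "samples-input-java/{name}", dataType = "binary", connection = "AzureWebJobsStorage") byte[] content,
        final ExecutionContext context
    ) {
        context.getLogger().info("Received " + content.length + " bytes");
    }

    // Optional input: an empty HTTP request body becomes Optional.empty() instead of null.
    @FunctionName("maybeEcho")
    public String maybeEcho(
        @HttpTrigger(name = "req", methods = { HttpMethod.POST }, authLevel = AuthorizationLevel.ANONYMOUS) Optional<String> body
    ) {
        return body.orElse("(empty body)");
    }
}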

Inputs

Inputs are divided into two categories in Azure Functions: one is the trigger input and the other is the additional input. Trigger input is the input who triggers your function. And besides that, you may also want to get inputs from other sources (like a blob), that is the additional input.

Let's take the following code snippet as an example:

package com.example;

import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("echo")
    public String echo(
        @HttpTrigger(name = "req", methods = { HttpMethod.PUT }, authLevel = AuthorizationLevel.ANONYMOUS, route = "items/{id}") String in,
        @TableInput(name = "item", tableName = "items", partitionKey = "example", rowKey = "{id}", connection = "AzureWebJobsStorage") TestInputData inputData
    ) {
        return "Hello, " + in + " and " + inputData.getRowKey() + ".";
    }

}

public class TestInputData {
    public String getRowKey() { return this.rowKey; }
    private String rowKey;
}

When this function is invoked, the HTTP request payload will be passed as the String argument in, and one entry will be retrieved from Azure Table Storage and passed to the argument inputData as the TestInputData type.

To receive events in a batch when using EventHubTrigger, set cardinality to Cardinality.MANY and change the input type to an array or List<T>:

@FunctionName("ProcessIotMessages")
    public void processIotMessages(
        @EventHubTrigger(name = "message", eventHubName = "%AzureWebJobsEventHubPath%", connection = "AzureWebJobsEventHubSender", cardinality = Cardinality.MANY) List<TestEventData> messages,
        final ExecutionContext context)
    {
        context.getLogger().info("Java Event Hub trigger received messages. Batch size: " + messages.size());
    }
    
    public class TestEventData {
    public String id;
}

Note: You can also bind to String[], TestEventData[] or List<String>.

Outputs

Outputs can be expressed as a return value or as output parameters. If there is only one output, using the return value is recommended. For multiple outputs, you have to use output parameters.

The return value is the simplest form of output: you just return a value of any type, and the Azure Functions runtime will try to marshal it back to the actual type (such as an HTTP response). You can apply any output annotation to the function method (the name property of the annotation has to be $return) to define the return value output.

For example, a blob content copying function could be defined as the following code. The @StorageAccount annotation is used here to avoid duplicating the connection property on both @BlobTrigger and @BlobOutput.

package com.example;

import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("copy")
    @StorageAccount("AzureWebJobsStorage")
    @BlobOutput(name = "$return", path = "samples-output-java/{name}")
    public String copy(@BlobTrigger(name = "blob", path = "samples-input-java/{name}") String content) {
        return content;
    }
}

To produce multiple output values, use the OutputBinding<T> type defined in the azure-functions-java-library package. If you need to return an HTTP response and also push a message to a queue, you can write something like:

package com.example;

import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("push")
    public String push(
        @HttpTrigger(name = "req", methods = { HttpMethod.POST }, authLevel = AuthorizationLevel.ANONYMOUS) String body,
        @QueueOutput(name = "message", queueName = "myqueue", connection = "AzureWebJobsStorage") OutputBinding<String> queue
    ) {
        queue.setValue("This is the queue message to be pushed");
        return "This is the HTTP response content";
    }
}

Use the OutputBinding<byte[]> type to produce a binary output value from a parameter; for return values, just use byte[].
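
For example, here is a minimal sketch of a binary blob copy that writes through an output parameter (the function name binaryCopy and the blob paths are placeholders):

package com.example;

import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("binaryCopy")
    @StorageAccount("AzureWebJobsStorage")
    public void binaryCopy(
        @BlobTrigger(name = "blob", path = "samples-input-java/{name}", dataType = "binary") byte[] content,
        @BlobOutput(name = "target", path = "samples-output-java/{name}", dataType = "binary") OutputBinding<byte[]> target
    ) {
        // Set the binary payload on the output parameter instead of returning it.
        target.setValue(content);
    }
}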

Execution Context

You interact with the Azure Functions execution environment via the ExecutionContext object defined in the azure-functions-java-library package. From the context object you can get the invocation ID, the function name, and a built-in logger (which integrates with the Azure Functions portal experience as well as Application Insights).

All you need to do is add one more ExecutionContext-typed parameter to your function method. Let's take a timer-triggered function as an example:

package com.example;

import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("heartbeat")
    public static void heartbeat(
        @TimerTrigger(name = "schedule", schedule = "*/30 * * * * *") String timerInfo,
        ExecutionContext context
    ) {
        context.getLogger().info("Heartbeat triggered by " + context.getFunctionName());
    }
}

Specialized Data Types

HTTP Request and Response

Sometimes a function needs more detailed control of the input and output, and that's why we also provide some specialized types in the azure-functions-java-library package:

Specialized Type         Target                 Typical Usage
HttpRequestMessage<T>    HTTP Trigger           Get method, headers or queries
HttpResponseMessage      HTTP Output Binding    Return status other than 200
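
As an illustration, here is a minimal sketch that uses both types to return a status other than 200; the function name lookup and the id query parameter are placeholders:

package com.example;

import java.util.Optional;
import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("lookup")
    public HttpResponseMessage lookup(
        @HttpTrigger(name = "req", methods = { HttpMethod.GET }, authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Optional<String>> request
    ) {
        // Inspect the query string through the request object.
        String id = request.getQueryParameters().get("id");
        if (id == null) {
            // Build a non-200 response via the builder on the request.
            return request.createResponseBuilder(HttpStatus.BAD_REQUEST).body("Missing 'id' query parameter").build();
        }
        return request.createResponseBuilder(HttpStatus.OK).body("Found " + id).build();
    }
}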

Metadata

Metadata comes from different sources, like HTTP headers, HTTP queries, and trigger metadata. You can use @BindingName annotation together with the metadata name to get the value.

For example, the queryValue in the following code snippet will be "test" if the requested URL is http://{example.host}/api/metadata?name=test.

package com.example;

import java.util.Optional;
import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("metadata")
    public static String metadata(
        @HttpTrigger(name = "req", methods = { HttpMethod.GET, HttpMethod.POST }, authLevel = AuthorizationLevel.ANONYMOUS) Optional<String> body,
        @BindingName("name") String queryValue
    ) {
        return body.orElse(queryValue);
    }
}

License

This project is under the benevolent umbrella of the .NET Foundation and is licensed under the MIT License.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

azure-functions-java-library's People

Contributors

amamounelsayed, arroyc, brunoborges, cgillum, jainharsh98, jonathangiles, kaibocai, kamperiadis, microsoft-github-policy-service[bot], microsoftopensource, mohitp930, msftgits, pragnagopa, shreyas-gopalakrishna, shrohilla, tsuyoshiushio, vrdmr, yojagad


azure-functions-java-library's Issues

Consider Combining Java Library and Worker Projects as Multi-Module Maven Project

The two projects (java-library and java-worker) are directly linked to each other given the dependency that worker has on the library.

With that said, I hereby propose that the two projects' source code be unified into a single repository, yet released independently.

This will facilitate enhancements that impact both projects at the same time, and the release of major versions altogether while still allowing independent releases of minor versions.

The approach is to use a Maven Multi Module project.

Catching uncaught exceptions

My question is two-fold:

  1. I would like to know if there is a notion of pre-execution and post-execution hooks, meaning:
    when a function is invoked, are there hooks exposed that will be called before and after the function handler is invoked?

  2. Is there a way by means of a post-hook to catch uncaught exceptions? If not, what alternative can I use?

For example, if I have a function handler like below (which is written in Kotlin), what abilities can I utilise to catch an exception if one occurs, that is not explicitly caught by a try/catch block?

class Function {

    fun foobar(name: String?, context: ExecutionContext): String {

        // mimicking an uncaught exception
        throw Exception()

        return "Foobar"
    }
}

Event Hubs binding does not honor HTTP status code

@functionName("HttpTrigger-EmailOutBound")
@EventHubOutput(name = "message", eventHubName = "eventHubTest", connection = "AzureEventHubConnection")
public HttpResponseMessage run(
@HttpTrigger(name = "httpRequestMessage", methods = {HttpMethod.POST}, authLevel = AuthorizationLevel.ANONYMOUS)
HttpRequestMessage<Optional> httpRequestMessage,
final ExecutionContext context) {
return httpRequestMessage
.createResponseBuilder(HttpStatus.UNAUTHORIZED)
.body("unauthorized")
.build();
}

Expected output: HTTP status code 401
Actual: HTTP status code 200

Output:

{
"method": "",
"query": {},
"statusCode": "401",
"headers": {},
"enableContentNegotiation": false,
"body": "unauthorized"
}

Even though the response body has statusCode 401, the actual response has 200.


please document which artifact to pull from maven central

is it ...

a) com.microsoft.azure:azure-functions-java-core:1.0.0-beta-3
b) com.microsoft.azure.functions:azure-functions-java-library:1.2.2

or something else?

which one provides this package? "com.microsoft.azure.functions"
I just have this package "com.microsoft.azure.serverless.functions"

Could you please help me resolve the confusion?

Thanks

Change design to a Method-level annotation model

Currently, triggers and bindings are defined at the parameter level. Given the number of config parameters that the annotations may require, or that the user may want to set, this approach often makes the code extremely hard to read.

Only one trigger is allowed on a (Java method) function, making Trigger a definitive candidate to be a Method annotation, instead of a method-parameter annotation.

Example of Cosmos DB trigger:

 @FunctionName("cosmosDBMonitor")
    public void cosmosDbProcessor(
        @CosmosDBTrigger(name = "items",
            databaseName = "ToDoList",
            collectionName = "Items",
            leaseCollectionName = "leases",
            reateLeaseCollectionIfNotExists = true,
            connectionStringSetting = "AzureCosmosDBConnection") String[] items,
            final ExecutionContext context ) {
                context.getLogger().info(items.length + "item(s) is/are changed.");
            }

In the example above, it is hard to quickly find where the actual method body starts.

A better approach would be to move the annotation to the method-level:

   @FunctionName("cosmosDBMonitor")
   @CosmosDBTrigger(name = "items", databaseName = "ToDoList",
                                    collectionName = "Items", leaseCollectionName = "leases",
                                    createLeaseCollectionIfNotExists = true, 
                                    connectionStringSetting = "AzureCosmosDBConnection",
                                    parameter = "items")
    public void cosmosDbProcessor(String[] items, final ExecutionContext context ) {
        context.getLogger().info(items.length + "item(s) is/are changed.");
    }

Moving the annotation to the method level, requires a new way to bind the trigger and the input object. This can be done in two ways:

  1. Add a new annotation parameter called parameter where the user defines the name of the method parameter that will take the input
  2. Add a new method-parameter annotation type to link the trigger with the input object.

The code snippet above covers option 1.

For option 2, an example would be:

   @FunctionName("cosmosDBMonitor")
   @CosmosDBTrigger(name = "items", databaseName = "ToDoList",
                                    collectionName = "Items", leaseCollectionName = "leases",
                                    createLeaseCollectionIfNotExists = true, 
                                    connectionStringSetting = "AzureCosmosDBConnection")
    public void cosmosDbProcessor(@TriggerObject String[] items, final ExecutionContext context ) {
        context.getLogger().info(items.length + "item(s) is/are changed.");
    }

In the above example for option 2, a new annotation called @TriggerObject is defined to bind the trigger with the method-parameter.

This structure provides two benefits:

  1. Prevents developers from attempting to add two triggers to the same method, and only finding out whether this works after they try to run on Azure Functions (whether locally or on Azure).
  2. Makes the function method body easier to read.

The same approach should be considered for other types: Bindings, and Outputs.

Request for feedback: @JonathanGiles, @pragnagopa, @asavaritayal, @eduardolaureano, @jeffhollan

where is output Java example

This issue is copied from MicrosoftDocs/azure-docs#17061.


I do not see an output Java example. Is it supported or not?

Document Details
⚠ Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.

ID: 6d35c032-eae7-ac4b-2ad3-32fd0cd2318f
Version Independent ID: 92b0a972-eabe-8048-36b1-a1cefb3acf43
Content: Azure Table storage bindings for Azure Functions
Content Source: articles/azure-functions/functions-bindings-storage-table.md
Service: azure-functions
GitHub Login: @ggailey777
Microsoft Alias: glenga

BindingName: Add support for EventHubTrigger Cardinality.MANY

Currently there is no example, and maybe not even a possibility, of getting SystemProperties and Properties for messages from an Event Hub (IoT Hub) which is configured with Cardinality.MANY.

For example:

@BindingName(value = "Properties") Map<String, Object>[] properties,
@BindingName(value = "SystemProperties") Map<String, Object>[] systemProperties,
@EventHubTrigger(
    name = "message", eventHubName = "eventHubName", dataType = "", connection = "connection", consumerGroup = "consumerGroup",
    cardinality = Cardinality.MANY) String[] message,
final ExecutionContext context

will result in

Executed 'Functions.Test' (Failed, Id=xyz)
System.Private.CoreLib: Exception while executing function: Functions.Test. System.Private.CoreLib: Result: Failure
Exception: NullPointerException: 
Stack: java.lang.NullPointerException
    at com.microsoft.azure.functions.worker.binding.BindingDataStore.getTriggerMetatDataByName(BindingDataStore.java:54)
    at com.microsoft.azure.functions.worker.broker.ParameterResolver.resolve(ParameterResolver.java:62)
    at com.microsoft.azure.functions.worker.broker.ParameterResolver.resolve(ParameterResolver.java:42)
    at com.microsoft.azure.functions.worker.broker.JavaMethodExecutor.execute(JavaMethodExecutor.java:52)
    at com.microsoft.azure.functions.worker.broker.JavaFunctionBroker.invokeMethod(JavaFunctionBroker.java:51)
    at com.microsoft.azure.functions.worker.handler.InvocationRequestHandler.execute(InvocationRequestHandler.java:33)
    at com.microsoft.azure.functions.worker.handler.InvocationRequestHandler.execute(InvocationRequestHandler.java:10)
    at com.microsoft.azure.functions.worker.handler.MessageHandler.handle(MessageHandler.java:45)
    at com.microsoft.azure.functions.worker.JavaWorkerClient$StreamingMessagePeer.lambda$onNext$0(JavaWorkerClient.java:92)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
.

The only working option is Cardinality.ONE:

@BindingName(value = "Properties") Map<String, Object> properties,
@BindingName(value = "SystemProperties") Map<String, Object> systemProperties,
@EventHubTrigger(
    name = "message", eventHubName = "eventHubName", dataType = "", connection = "connection", consumerGroup = "consumerGroup",
    cardinality = Cardinality.ONE) String message,
final ExecutionContext context

Reference: https://stackoverflow.com/a/56094438

Add samples for Event Grid binding

Provide Java Function code samples for -

  • Event Grid trigger

For reference, the corresponding C# code that needs to be converted to Java is attached as an image in the original issue.

Add samples for HTTP Binding

Provide Java Function code samples for -

  • HTTP Trigger

  • HTTP Webhook

For reference, the corresponding C# code that needs to be converted to Java is attached as an image in the original issue.

Add samples for Timer binding

Provide Java Function code samples for -

  • Timer Trigger

For reference, the corresponding C# code that needs to be converted to Java is attached as an image in the original issue.

Request Headers not being injected into HttpRequestMessage

I have setup an Azure Function using the Java runtime locally and have published it to Azure. Everything has been fairly smooth so far with one caveat.
Inside my function body, I need to interact with the HTTP Request Headers. This is an easy enough task - just make a call to request.getHeaders() which should return a Map<String, String>.
Then I can just interact with the map in the regular way, e.g. calling .get(), passing it the name of the header I want.

However, any call to get a header from the Map returns null.

Here is my curl request to my function with the headers I expect to see in the request.

*   Trying ::1...
* TCP_NODELAY set
* Connection failed
* connect to ::1 port 7071 failed: Connection refused
*   Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to localhost (127.0.0.1) port 7071 (#0)
> GET /api/HttpTrigger-Java?company=boom&location=ok HTTP/1.1
> Host: localhost:7071
> User-Agent: curl/7.54.0
> Accept: */*
> CustomHeader: boom

A screenshot showing the output of the getHeaders() call was attached to the original issue.

Can anybody offer some recommendations? Is there anything I have to do to make the request headers be injected into the getHeaders() variable?

No trigger binding specified for CustomBindings

Problem

I have an extension that works on the .NET side. I created a Java implementation using CustomBinding,
installed the extension, and executed Azure Functions with Maven; then I encountered this issue.
What is missing in my configuration?

KafkaTrigger-Java: No trigger binding specified. A function must have a trigger input binding.

Configuration

My Trigger's definition is like this.

@Target(ElementType.PARAMETER)
@Retention(RetentionPolicy.RUNTIME)
@CustomBinding(direction = "in", name = "kafkaEvents", type = "KafkaTriggerAttribute")
public @interface KafkaTrigger {

I tried changing the type to other values. The "KafkaTriggerAttribute" is taken from the definition of the IExtensionConfigProvider implementation in C#.

            // register our trigger binding provider
            var triggerBindingProvider = new KafkaTriggerAttributeBindingProvider(config, options, converterManager, nameResolver, loggerFactory);
            context.AddBindingRule<KafkaTriggerAttribute>()
                .BindToTrigger(triggerBindingProvider);

The function.json which is generated by the package step is:

{
  "scriptFile" : "../kafka-function-1.0-SNAPSHOT.jar",
  "entryPoint" : "com.contoso.kafka.Function.run",
  "bindings" : [ {
    "type" : "KafkaTriggerAttribute",
    "direction" : "in",
    "name" : "kafkaEvents",
    "topic" : "pageviews",
    "consumerGroup" : "azfunc",
    "brokerList" : "LocalBroker"
  } ]
}

I also tried the same value as the type in the C# function.json; it was "kafkaTrigger". However, it doesn't work. It returns "[04/20/2019 18:27:40] KafkaTrigger-Java: The binding type(s) 'kafkaTrigger' are not registered. Please ensure the type is correct and the binding extension is installed." Any ideas?

Implementation of IExtensionConfigProvider

https://github.com/Azure/azure-functions-kafka-extension/blob/tsuyoshi/javaimpl/src/Microsoft.Azure.WebJobs.Extensions.Kafka/Config/KafkaExtensionConfigProvider.cs#L21

Java implementation

https://github.com/Azure/azure-functions-kafka-extension/blob/tsuyoshi/javaimpl/binding-library/java/src/main/java/com/microsoft/azure/functions/kafka/annotation/KafkaTrigger.java

@BlobTrigger long delay

Hi!

I've got a function with a BlobTrigger.
I copied a file at 11:25:13 into the blob container. The function was triggered at 11:48:53.
So it took about 23 minutes to actually run the function...
I expected immediate execution. Is this perhaps configurable?

Regards,
Marco Mans

Blob trigger with large file results in timeout

I have a BlobTrigger Java function which I believe is timing out when deployed to Azure after 5 minutes because it takes too long for the function to complete. I say this because I see the first log message Executing 'Functions.JavaDocBlobWatcher' (Reason='New blob detected: incoming/output.zip', Id=0f54292e-460e-4c2b-8e10-f9489fe2f34b), but then I see a second message Timeout value of 00:05:00 exceeded by function 'Functions.JavaDocBlobWatcher' (Id: '0f54292e-460e-4c2b-8e10-f9489fe2f34b'). Initiating cancellation.

The file is 38.1 MB and I believe it is all loaded into the byte array that the BlobTrigger is associated with. From a brief bit of testing I believe that large blobs of data take a very long time to be sent into the function, and I wonder whether something other than a byte[] should be used (e.g. some kind of input stream).

HttpResponseMessage became unusable from version 1.1.0-beta4 to 1.1.0-beta5

The API for the HTTP trigger changed in a breaking way from 1.0.0-beta-4 to 1.0.0-beta-5:

  1. HttpResponseMessage became non-generic
  2. An HttpResponseMessage can now be generated using a builder from the request
  3. If you try to return that HttpResponseMessage, the runtime complains that it wants an output binding for the trigger.
  4. Documentation still talks about an HttpResponseMessage<T>

Error message

[6/26/2018 10:27:09 AM] Host started (2071ms)
[6/26/2018 10:27:09 AM] Job host started
[6/26/2018 10:27:09 AM] The following 1 functions are in error:
[6/26/2018 10:27:09 AM] test: At least one binding must be declared.

Expected

Updated documentation on how to return HTTP responses in a controlled way instead of returning a string or throwing an Exception.

Sample code

The code below compiles with 1.0.0-beta-5, but the runtime moans about wanting an output binding:

UPDATE: if I change to just return a String and throw exceptions in case of errors, it still wants an output binding.

 
  @FunctionName("test")
  public HttpResponseMessage test(//
      @HttpTrigger(//
          name = "req", //
          methods = {HttpMethod.GET},
          authLevel = AuthorizationLevel.FUNCTION) //
      final HttpRequestMessage<String> req, //
      final ExecutionContext context) {

    requireNonNull(context, "Context should not be null in " + CREATE_TEST_MY_SESSION_METHOD_NAME);
    // creating a session
    final String sessionID = createRandomSessionID();
    final Logger logger = prepareLogger(context, environment, sessionID);
    logEntering(logger, this.getClass(), CREATE_TEST_MY_SESSION_METHOD_NAME);
    requireNonNull(req, "Req should not be null in " + CREATE_TEST_MY_SESSION_METHOD_NAME);

    final CompletableFuture<HttpResponseMessage> completableFuture = new CompletableFuture<>();
    HttpResponseMessage response = null;
    try {
      getFunctionsExecutor().submit(() -> completableFuture.complete(testInternal(req, sessionID, logger)));
      response = completableFuture.get(4, TimeUnit.SECONDS);
    } catch (InterruptedException | ExecutionException | TimeoutException e) {
      final String msg = String.format("%s:[%s]: %s", getVersionPrefixWithSessionID(sessionID),
          CREATE_TEST_MY_SESSION_METHOD_NAME, e.getMessage());
      response = req //
          .createResponseBuilder(HttpStatus.SERVICE_UNAVAILABLE) //
          .header(RETRY_AFTER, TWO_MINUTES) //
          .header(HttpHeaders.CONTENT_TYPE, TEXT_PLAIN) //
          .body(msg) //
          .build();
    } catch (final Throwable e) {
      logThrowable(logger, e.getMessage(), e);
      final String msg = UNEXPECTED_EXCEPTION_MESSAGE + sessionID;
      response = req //
          .createResponseBuilder(HttpStatus.SERVICE_UNAVAILABLE) //
          .header(RETRY_AFTER, TWO_MINUTES) //
          .header(HttpHeaders.CONTENT_TYPE, TEXT_PLAIN) //
          .body(msg) //
          .build();
    } finally {
      logExiting(logger, this.getClass(), CREATE_TEST_MY_SESSION_METHOD_NAME);
    }
    return response;
  }

Add samples for Queue binding

Provide Java Function code samples for -

  • Queue trigger

  • Queue output

For reference, the corresponding C# code that needs to be converted to Java is attached as an image in the original issue.

HttpTrigger with empty methods does not respond to all methods

Based on the HttpTrigger docs, if the methods are left empty in the HttpTrigger definition, it should respond to all methods.

When not explicitly defined, the HttpTrigger annotation in Java generates an empty methods array in function.json. This means the function doesn't respond to any HTTP methods at all and is inconsistent with the docs and the way the trigger works in other languages.

Provide access to settings through context parameter

I created the following issue on the java worker repo, but this repo seems to be a better fit. Azure/azure-functions-java-worker#281

Currently it is hard to unit test a function that uses an app setting by reading it from the environment variables. It would be much easier if there was a getSetting(settingName) method on the context parameter, as the context parameter can be mocked in unit tests. At runtime the host can continue to read the value from an environment variable via the method in the context parameter, so there wouldn't be any breaking changes to the runtime.

Default connection to "AzureWebJobsStorage"

From @mjiderhamn in microsoft/azure-maven-plugins#269

Plugin name and version

azure-functions-maven-plugin 1.0.0-beta-2

Expected behavior

com.microsoft.azure.functions.annotation.*Trigger.connection has a default value of empty string. Documentation (albeit for C#) says "If you leave connection empty, the Functions runtime uses the default Storage connection string in the app setting that is named AzureWebJobsStorage."

Actual behavior

If the connection annotation attribute is not provided, the following error occurs when executing azure-functions:package:

[ERROR] Failed to execute goal com.microsoft.azure:azure-functions-maven-plugin:1.0.0-beta-2:package (package-functions) on project serverless-azure: Storage binding (blob/queue/table) must have non-empty connection. Invalid storage binding found on method: se.jiderhamn.serverless.azure.AzureFunction.transformBlob -> [Help 1]

Bind multiple query values with the same name.

Hi!

In the previous version of the library (azure-functions-java-core), bound query values with the same name resulted in a comma-separated string. In the new version I only get the last one found.

Example request: https://test.azurewebsites.net/api/test?scopes=1&scopes=2

public static HttpResponseMessage function(final HttpRequestMessage request, @BindingName("scopes") final String scopes, final ExecutionContext context) {
    context.getLogger().info(scopes);
    return request.createResponse(200, "OK");
}

Output:
previous version: "1,2"
new version: "2"

Is there a new syntax to bind multiple query values with the same name?

CosmosDBTrigger is missing important annotations

Per https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2 when using a CosmosDBTrigger, there are properties such as "leaseCollectionPrefix" which cannot be set. This particular one is very important when setting multiple trigger functions against the same CosmosDB database and collection.

Repro steps

Provide the steps required to reproduce the problem

Step A
Create a simple Java Azure Function that uses a CosmosDB trigger

Step B
Try to specify "leaseCollectionPrefix" property.

Expected behavior

All the properties as captured in https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2 can be specified.

Package version
azure-functions-java-library-1.0.0-beta-5.jar

Links to source
https://github.com/Microsoft/inventory-hub-java-on-azure/blob/master/function-apps/Notify-Inventory/src/main/java/org/inventory/hub/NotifyInventoryUpdate.java#L28

Add support to specify partition key on EventHub output binding

Today for C# functions, if you send the EventData type to the output binding, you can define a system property for the partition key which is honored when sending the message. Today the Java output binding appears to only support String or byte[], which gives no way to specify a partition key.

How to bind input JSON and use it as POJO?

For example, the input queue message is JSON :

{  
   "image_path":"1905/10/1234/1234.jpg",
   "watermakr_id":"1234"
}

And I need it automatically deserialized as a POJO, so I can access its properties like this:

	@FunctionName("queueMessageTriggerForWatermark")
	public void BlobQueueTriggerForWatermark(
			@QueueTrigger(name = "message", queueName = "image-watermark-queue") String message,
			@BlobInput(name = "file1", dataType = "binary", path = "{**queueTrigger.path**}.jpg") byte[] arInputImage_org) throws Exception {

How to make it possible?

Setup CI

Setup CI that runs EndToEnd tests to validate updates in the azure-functions-library

Add type and direction infos in Annotation class

background issue microsoft/azure-maven-plugins#456

There used to be a strong dependency between the function Maven plugin and the function Java library, and now we have changed to use Java reflection to get attributes from annotations to reduce the dependency (PR microsoft/azure-maven-plugins#508).
However, we still need to store annotation type and direction info in our enum class because we can't get them from the annotations. So could you please add this info to the annotations of the Java library, e.g. by adding an annotation? If so, all needed information could come from the Java library directly.
Besides, could you please update the function.json schema? A schema would be very helpful in local validation, and users could verify and update their code according to the schema as well.

Add samples for Event Hubs binding

Provide Java Function code samples for -

  • Event Hubs trigger

  • Event Hubs Output

For reference, the corresponding C# code that needs to be converted to Java is attached as an image in the original issue.

Add samples for Blob binding

Provide Java Function code samples for -

  • Blob trigger

  • Blob input

  • Blob output

For reference, the corresponding C# code that needs to be converted to Java is attached as an image in the original issue.

Standardize Git Branches and Tags

It is very common practice, on GitHub and at almost all Azure/Microsoft repositories, to use the master branch as the upstream (latest working/released version) as well as the default branch of the GitHub repository.

Tags are good for pointing to previously released versions.

For tags, the current tag name pattern is fine, perhaps just dropping the 'v' (e.g. 1.0.0-beta-5), as long as the Maven artifact is released under the same version string.

Would be good to standardize it this way.

Add session support for service bus trigger

Need a new property on the Service Bus trigger, as sessions are supported in versions >= 3.1.0.

isSessionsEnabled

    {
      "type": "serviceBusTrigger",
      "connection": "ServiceBusConnectionString",
      "isSessionsEnabled": true,
      "queueName": "queue",
      "name": "message"
    }

Also, we don't need the AccessRights property. That's an appendage from v1 and isn't required in v2.

The current logging is not user friendly. Any plan for supporting Slf4j in future?

In my understanding, the only current logging option is to get a Logger instance from ExecutionContext. We have to pass this logger instance on to any dependent method in case we want to add more logging in dependent methods too. Also, java.util.logging is an older way to manage logs in a Java application and is not user friendly.
Here is a link - (https://stackoverflow.com/questions/11359187/why-not-use-java-util-logging)

slf4j (https://www.slf4j.org/) is a nice framework for managing logs in a user-friendly way. It also supports better, more user-friendly string substitution for logs (https://www.slf4j.org/api/org/slf4j/helpers/MessageFormatter.html).

Is there any plan for supporting slf4j in future?

Ideally, if Azure Java Functions could support Lombok (@Slf4j) as a first-class citizen, we could annotate the function and other classes with @Slf4j and the rest of the logging would be really easy to follow, rather than deriving a Logger from ExecutionContext and passing it everywhere.

Let me know your thoughts.

Use HTTP Integer codes as the argument to createResponseBuilder() method NOT enums

I have been told this is the place to raise this (again - apologies for listing in multiple places).
Until 2.0.11933 (almost 1 YEAR), the createResponse() method signature has used an integer HTTP code. We have built functions that rely on returning non-standard (e.g. 460) HTTP codes for INTERNAL consumption (within our framework) only. The new method (createResponseBuilder) currently requires an enum that represents a "standard" HTTP code. It is painful enough to have to change source code to use a different method (createResponseBuilder rather than createResponse), but we MUST be able to generate HTTP codes rather than relying on those that MS chooses to define as an enum. While I generally agree with using enums as parameters, in this case, since the enum MUST be converted back to an integer for HTTP protocols, there is no compelling reason for MS to interject an enum mapping - which also has the side effect of prohibiting custom codes. If you must, put in a sanity check for integers below 100 or above 1000, but please do not prohibit custom codes. I note also that anyone else who is seriously using Java HTTP trigger functions by necessity would have been using integers - so I don't think making the new method signature revert to integers is an issue for current developers - everyone's source code is going to have to change just to accommodate the new method.
Thanks.
TurtleBeach AKA -jack-

@EventHubTrigger cannot deserialize HttpResponseMessage. What is best way to deserialize HttpResponseMessage?

I have a function(@HttpTrigger) which writes HttpResponseMessage to email topic. I am trying to create another function(@EventHubTrigger) to read message from this topic.

@FunctionName("EventHubTrigger-EmailCurated")
    public void run(
            @EventHubTrigger(name = "httpResponseMessage", eventHubName = "email", connection = "AzureEventHubConnection") HttpResponseMessage httpResponseMessage,
            final ExecutionContext context) {
        logger = context.getLogger();
        logger.info("EventHubTrigger-EmailCurated received a httpResponseMessage : " + httpResponseMessage);
}
Stack: java.lang.RuntimeException: Unable to invoke no-args constructor for interface com.microsoft.azure.functions.HttpResponseMessage. Registering an InstanceCreator with Gson for this type may fix this problem.
Caused by: java.lang.UnsupportedOperationException: Interface can't be instantiated! Interface name: com.microsoft.azure.functions.HttpResponseMessage

I understand HttpResponseMessage is an interface. Is there any implementation class available that I can cast to?

Improve javadoc for templates

Trigger:

Blob
Cosmos DB
Event Hubs
Service Bus Queue
Service Bus Topic
Event Grid

Input Binding

Blob
Table
Cosmos DB

Output Binding

Blob
Table
Cosmos DB
Event Hubs
Service Bus Queue
Service Bus Topic
HTTP

Add samples for Service Bus binding

Provide Java Function code samples for -

  • ServiceBus trigger

  • ServiceBus output

For reference, the corresponding C# code that needs to be converted to Java is attached as an image in the original issue.

Event Hubs trigger doesn't support batch trigger

From @jeffhollan on June 13, 2018 12:14

The best practice for running functions at high scale is to pull in a batch of messages in a single execution (EventData[]). For JavaScript this is described in the function.json as:

{
...
"cardinality": "many"
}
  1. The Java attribute should allow for a configurable flag to set cardinality to many
     • The generated function.json should include "cardinality": "many"
  2. The host should be able to send and have serialized EventData[] to the Java worker

@asavaritayal @pragnagopa as FYI

Copied from original issue: Azure/azure-functions-java-worker#122

Adding multiple entries to a storage queue via java azure function app

From the documentation I was expecting this code
` @functionName("TestPush2")
public HttpResponseMessage HttpTriggerJava(
@HttpTrigger(
name = "req",
methods = {HttpMethod.GET, HttpMethod.POST},
authLevel = AuthorizationLevel.ANONYMOUS
) HttpRequestMessage<Optional> request,
@QueueOutput(
name="queue",
queueName="queue-container",
connection="MessageQueue"
) OutputBinding queue,
final ExecutionContext context)
{
context.getLogger().info("Java HTTP trigger processed a request.");

    // Parse query parameter
    String query = request.getQueryParameters().get("name");
    String name = request.getBody().orElse(query);

    if (name == null) {
        return request.createResponseBuilder(HttpStatus.BAD_REQUEST).body("Please pass a name on the query string or in the request body").build();
    } else {
    	queue.setValue("Echo 1");
    	queue.setValue("Echo 2");        	
        return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + name).build();
    }
}

to add 2 items to the queue (like ICollector myDestinationQueue does in C#), however only the last item is added.
