Creating Cloud Dataflow Templates
Cloud Dataflow templates use runtime parameters to accept values that are only available during pipeline execution. To customize the execution of a templated pipeline, you can pass these parameters to functions that run within the pipeline (such as a `DoFn`).
To create a template from your Apache Beam pipeline, you must modify your pipeline code to support runtime parameters:
- Use `ValueProvider` for all pipeline options that you want to set or use at runtime.
- Call I/O methods that accept runtime parameters wherever you want to parameterize your pipeline.
- Use `DoFn` objects that accept runtime parameters.
Then, create and stage your template.
Runtime parameters and the ValueProvider interface
The `ValueProvider` interface allows pipelines to accept runtime parameters. Apache Beam provides three types of `ValueProvider` objects.
Name | Description |
---|---|
`RuntimeValueProvider` | The default `ValueProvider` type. A `RuntimeValueProvider` holds a value that is only available during pipeline execution. Use `RuntimeValueProvider` when you do not know the value ahead of time. |
`StaticValueProvider` | Provides a static value to your pipeline. Use `StaticValueProvider` when you know the value ahead of time. |
`NestedValueProvider` | Computes a value from another `ValueProvider` object. Use `NestedValueProvider` when you need to translate a value at runtime. Note: The Apache Beam SDK for Python does not support `NestedValueProvider`. |
Modifying your code to use runtime parameters
This section walks through how to use `ValueProvider`, `StaticValueProvider`, and `NestedValueProvider`.
Using ValueProvider in your pipeline options
Use `ValueProvider` for all pipeline options that you want to set or use at runtime.
For example, the following `WordCount` code snippet does not support runtime parameters. The code adds an input file option, creates a pipeline, and reads lines from the input file:
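A minimal Java (SDK 2.x) sketch of such a snippet; the `WordCount` class and the `inputFile` option name are illustrative, based on the standard Beam WordCount example:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class WordCount {
  public interface WordCountOptions extends PipelineOptions {
    @Description("Path of the file to read from")
    String getInputFile();
    void setInputFile(String value);
  }

  public static void main(String[] args) {
    WordCountOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(WordCountOptions.class);
    Pipeline p = Pipeline.create(options);
    // The option is a plain String, so its value is fixed when the pipeline
    // graph is constructed and cannot be supplied at template run time.
    p.apply("ReadLines", TextIO.read().from(options.getInputFile()));
    p.run();
  }
}
```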
To add runtime parameter support, modify the input file option to use `ValueProvider`.
Java: SDK 2.x
Use `ValueProvider<String>` instead of `String` for the type of the input file option.
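A sketch of the same option declared with `ValueProvider<String>` (names as above are illustrative). `TextIO.read().from()` also has an overload that accepts a `ValueProvider<String>`, so the rest of the pipeline is unchanged:

```java
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.ValueProvider;

public interface WordCountOptions extends PipelineOptions {
  @Description("Path of the file to read from")
  ValueProvider<String> getInputFile();
  void setInputFile(ValueProvider<String> value);
}
```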
Python
Replace `add_argument` with `add_value_provider_argument`.
Java: SDK 1.x
Use `ValueProvider<String>` instead of `String` for the type of the input file option.
Using ValueProvider in your functions
To use runtime parameter values in your own functions, update the functions to use `ValueProvider` parameters.
The following example contains an integer `ValueProvider` option, and a simple function that adds an integer. The function depends on the `ValueProvider` integer. During execution, the pipeline applies `MySumFn` to every integer in a `PCollection` that contains `[1, 2, 3]`. If the runtime value is 10, the resulting `PCollection` contains `[11, 12, 13]`.
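A minimal Java sketch of such a `DoFn`; the key point is that the function stores the `ValueProvider` itself and calls `.get()` only at processing time:

```java
import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.transforms.DoFn;

public class MySumFn extends DoFn<Integer, Integer> {
  private final ValueProvider<Integer> mySumInteger;

  public MySumFn(ValueProvider<Integer> sumInt) {
    // Store the ValueProvider, not its value: the value may not be
    // available until the pipeline runs.
    this.mySumInteger = sumInt;
  }

  @ProcessElement
  public void processElement(ProcessContext c) {
    // Call .get() at processing time, when the runtime value is available.
    c.output(c.element() + mySumInteger.get());
  }
}
```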
Using StaticValueProvider
To provide a static value to your pipeline, use `StaticValueProvider`.
This example uses `MySumFn`, which is a `DoFn` that takes a `ValueProvider<Integer>`. If you know the value of the parameter ahead of time, you can use `StaticValueProvider` to specify your static value as a `ValueProvider`.
Java: SDK 2.x
This code gets the value at pipeline runtime:
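(A minimal sketch, assuming the pipeline options define a `getSumInteger()` method that returns a `ValueProvider<Integer>`.)

```java
// The runtime value of sumInteger is read inside MySumFn via .get().
PCollection<Integer> sums =
    numbers.apply(ParDo.of(new MySumFn(options.getSumInteger())));
```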
Instead, you can use `StaticValueProvider` with a static value:
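(Continuing the sketch above; `StaticValueProvider.of()` wraps a value that is known at graph-construction time so it can be passed wherever a `ValueProvider` is expected.)

```java
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;

// The value 10 is known before the pipeline runs.
PCollection<Integer> sums =
    numbers.apply(ParDo.of(new MySumFn(StaticValueProvider.of(10))));
```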
Python
This code gets the value at pipeline runtime:
Instead, you can use `StaticValueProvider` with a static value:
Java: SDK 1.x
This code gets the value at pipeline runtime:
Instead, you can use `StaticValueProvider` with a static value:
You can also use `StaticValueProvider` when you implement an I/O module that supports both regular parameters and runtime parameters. `StaticValueProvider` reduces the code duplication from implementing two similar methods.
Java: SDK 2.x
The source code for this example is from Apache Beam's TextIO.java on GitHub.
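A simplified sketch of that pattern (abbreviated, not the verbatim source): the plain-`String` overload wraps its argument in a `StaticValueProvider` and delegates to the `ValueProvider` overload, so the logic is written only once:

```java
// Overload for a regular parameter: wrap it and delegate.
public Read from(String filepattern) {
  checkNotNull(filepattern, "Filepattern cannot be empty.");
  return from(StaticValueProvider.of(filepattern));
}

// Overload for a runtime parameter.
public Read from(ValueProvider<String> filepattern) {
  checkNotNull(filepattern, "Filepattern cannot be empty.");
  return toBuilder().setFilepattern(filepattern).build();
}
```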
Python
In this example, there is a single constructor that accepts either a `string` or a `ValueProvider` argument. If the argument is a `string`, it is converted to a `StaticValueProvider`.
Java: SDK 1.x
The source code for this example is from Apache Beam's TextIO.java on GitHub.
Using NestedValueProvider
Note: The Apache Beam SDK for Python does not support `NestedValueProvider`.
To compute a value from another `ValueProvider` object, use `NestedValueProvider`.
`NestedValueProvider` takes a `ValueProvider` and a `SerializableFunction` translator as input. When you call `.get()` on a `NestedValueProvider`, the translator creates a new value based on the `ValueProvider` value. This translation allows you to use a `ValueProvider` value to create the final value that you want:
- Example 1: The user provides a file name `file.txt`. The transform prepends the file path `gs://directory_name/` to the file name. Calling `.get()` returns `gs://directory_name/file.txt`.
- Example 2: The user provides a substring for a BigQuery query, such as a specific date. The transform uses the substring to create the full query. Calling `.get()` returns the full query.
Note: `NestedValueProvider` accepts only one value input. You can't use a `NestedValueProvider` to combine two different values.
The following code uses `NestedValueProvider` to implement the first example: the user provides a file name, and the transform prepends the file path to the file name.
Java: SDK 2.x
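A sketch, assuming the pipeline options define a `getFileName()` method that returns a `ValueProvider<String>`; the translator function runs when `.get()` is called:

```java
import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.options.ValueProvider.NestedValueProvider;
import org.apache.beam.sdk.transforms.SerializableFunction;

// Wrap the user-supplied file name; prepend the path when .get() is called.
ValueProvider<String> gcsPath =
    NestedValueProvider.of(
        options.getFileName(),
        new SerializableFunction<String, String>() {
          @Override
          public String apply(String fileName) {
            return "gs://directory_name/" + fileName;
          }
        });
```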
Python
The Apache Beam SDK for Python does not support `NestedValueProvider`.
Java: SDK 1.x
Metadata
You can extend your templates with additional metadata so that custom parameters are validated when the template executes. If you want to create metadata for your template, you need to:
- Create a JSON-formatted file named `<template-name>_metadata` using the parameters from the table below. Note: Do not name the file you create `<template-name>_metadata.json`. While the file contains JSON, it cannot end in the `.json` file extension.
- Store the JSON file in Cloud Storage in the same folder as the template. Note: The template should be stored in `<template-name>` and the metadata should be stored in `<template-name>_metadata`.
Metadata parameters
Parameter Key | Required | Description of the value |
---|---|---|
name | Yes | The name of your template. |
description | No | A short paragraph of text describing the template. |
parameters | No. Defaults to an empty array. | An array of additional parameters that will be used by the template. Each parameter in the array uses the following keys: |
name | Yes | The name of the parameter used in your template. |
label | Yes | A human readable label that will be used in the UI to label the parameter. |
help_text | Yes | A short paragraph of text describing the parameter. |
is_optional | No. Defaults to false. | true if the parameter is optional and false if the parameter is required. |
regexes | No. Defaults to an empty array. | An array of POSIX-egrep regular expressions in string form that will be used to validate the value of the parameter. For example: ['^[a-zA-Z][a-zA-Z0-9]+'] is a single regular expression that validates that the value starts with a letter and then has one or more characters. |
Example metadata file
The Cloud Dataflow service uses the following metadata to validate the WordCount template's custom parameters:
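Based on the parameter keys in the table above, a WordCount metadata file has roughly the following shape (the parameter names, help text, and regexes are illustrative):

```json
{
  "name": "WordCount",
  "description": "An example pipeline that counts words in the input file.",
  "parameters": [
    {
      "name": "inputFile",
      "label": "Input Cloud Storage file(s)",
      "help_text": "Path of the file pattern to read from.",
      "regexes": ["^gs:\\/\\/[^\\n\\r]+$"],
      "is_optional": true
    },
    {
      "name": "output",
      "label": "Output Cloud Storage file(s)",
      "help_text": "Path and filename prefix for writing output files.",
      "regexes": ["^gs:\\/\\/[^\\n\\r]+$"]
    }
  ]
}
```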
You can download this metadata file from the Cloud Dataflow template directory.
Pipeline I/O and runtime parameters
Java: SDK 2.x
Some I/O connectors contain methods that accept `ValueProvider` objects. To determine support for a specific connector and method, see the API reference documentation for the I/O connector. Supported methods have an overload with a `ValueProvider`. If a method does not have an overload, the method does not support runtime parameters. The following I/O connectors have at least partial `ValueProvider` support:
- File-based IOs: `TextIO`, `AvroIO`, `FileIO`, `TFRecordIO`, `XmlIO`
- `BigQueryIO`*
- `BigtableIO` (requires SDK 2.3.0 or later)
- `PubSubIO`
- `SpannerIO`

* Note: If you want to run a batch pipeline that reads from BigQuery, you must use `.withTemplateCompatibility()` on all BigQuery reads.
Python
Some I/O connectors contain methods that accept `ValueProvider` objects. To determine support for I/O connectors and their methods, see the API reference documentation for the connector. The following I/O connectors accept runtime parameters:
- File-based IOs: `textio`, `avroio`, `tfrecordio`
Java: SDK 1.x
The following table contains the complete list of methods that accept runtime parameters.
I/O | Method |
---|---|
BigQuery* | BigQueryIO.Read.from()* |
 | BigQueryIO.Read.fromQuery()* |
 | BigQueryIO.Write.to()* |
 | BigQueryIO.Write.withSchema()* |
Cloud Pub/Sub | PubsubIO.Read.subscription() |
 | PubsubIO.Read.topic() |
 | PubsubIO.Write.topic() |
TextIO | TextIO.Read.from() |
 | TextIO.Write.to() |
* You can only execute BigQuery batch pipeline templates one time, as the BigQuery job ID is set at template creation time.
Creating and staging templates
After you write your pipeline, you must create and stage your template file. Use the command for your SDK version.
Note: After you create and stage a template, the staging location contains additional files that are necessary to execute your template. If you delete the staging location, template execution will fail.
Java: SDK 2.x
This Maven command creates and stages a template at the Cloud Storage location specified with `--templateLocation`.
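A sketch of the command, assuming the project is built with the Maven exec plugin; the staging path is illustrative:

```
mvn compile exec:java \
    -Dexec.mainClass=com.example.myclass \
    -Dexec.args="--runner=DataflowRunner \
                 --project=YOUR_PROJECT_ID \
                 --stagingLocation=gs://YOUR_BUCKET_NAME/staging \
                 --templateLocation=gs://YOUR_BUCKET_NAME/templates/YOUR_TEMPLATE_NAME"
```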

- Replace `YOUR_PROJECT_ID` with your project ID.
- Replace `YOUR_BUCKET_NAME` with the name of your Cloud Storage bucket.
- Replace `YOUR_TEMPLATE_NAME` with the name of your template.
- Replace `com.example.myclass` with your Java class.
- Verify that the `templateLocation` path is correct.
Python
This Python command creates and stages a template at the Cloud Storage location specified with `--template_location`.
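A sketch of the command, assuming the pipeline module can be run with `python -m`; the staging and temp paths are illustrative:

```
python -m examples.mymodule \
    --runner DataflowRunner \
    --project YOUR_PROJECT_ID \
    --staging_location gs://YOUR_BUCKET_NAME/staging \
    --temp_location gs://YOUR_BUCKET_NAME/temp \
    --template_location gs://YOUR_BUCKET_NAME/templates/YOUR_TEMPLATE_NAME
```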
Make the following changes to the command:

- Replace `YOUR_PROJECT_ID` with your project ID.
- Replace `YOUR_BUCKET_NAME` with the name of your Cloud Storage bucket.
- Replace `YOUR_TEMPLATE_NAME` with the name of your template.
- Replace `examples.mymodule` with your Python module.
- Verify that the `template_location` path is correct.
Java: SDK 1.x
This Maven command creates and stages a template at the Cloud Storage location specified with `--dataflowJobFile`.
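A sketch of the command; the runner name `TemplatingDataflowPipelineRunner` is an assumption for the legacy SDK 1.x, and the staging path is illustrative:

```
mvn compile exec:java \
    -Dexec.mainClass=com.example.myclass \
    -Dexec.args="--runner=TemplatingDataflowPipelineRunner \
                 --project=YOUR_PROJECT_ID \
                 --stagingLocation=gs://YOUR_BUCKET_NAME/staging \
                 --dataflowJobFile=gs://YOUR_BUCKET_NAME/templates/YOUR_TEMPLATE_NAME"
```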
- Replace `YOUR_PROJECT_ID` with your project ID.
- Replace `YOUR_BUCKET_NAME` with the name of your Cloud Storage bucket.
- Replace `YOUR_TEMPLATE_NAME` with the name of your template.
- Replace `com.example.myclass` with your Java class.
- Verify that the `dataflowJobFile` path is correct.
After you create and stage your template, your next step is to execute the template.