
Blender 2.8 Viewport Development


Introduction

The original Blender 2.8 design brought a workflow-based usability mantra to the conversation. Within those constraints we went over the ideal pipeline for a few well-defined workflows.

Here is an excerpt from a recent viewport development design document which is still valid for this proposal and summarizes the principles presented here:

“There’s a fundamental difference between an object’s material appearance and the way that object is displayed in the 3D view. Materials are part of your data, just as geometry is. One way to display is full material preview. When you want a high quality rendering of a lit scene, or you are working on the materials themselves, this is the best choice. There are other ways to view your data, useful in different situations or stages in your workflow.

Answer the question: what about your object do you need to see right now? Maybe the polygonal structure of your geometry is important — especially true while modeling! In this case you want to see the faces and edges clearly. Lighting and materials on the surface are less important and more likely a distraction. Sometimes the shape or form of an object is most important — sculpting is the best example. Individual polygon faces are less important, edges and materials are distracting. Or draw in solid black to discern your object or character’s silhouette” (merwin, June 2016)

To align the discussions with the artists and the technology we are building for Blender 2.8, we introduced a few new concepts: tasks (individual components of a workflow), plates (rendering techniques and interactions within a viewport) and minimum viable products (a few main early deliverables of the development cycle). Each of these concepts is explained in its own section.

Out of those discussions with a small group of artists and developers, we came up with the present design for the viewport. The ideas presented here will evolve during the project, as they are confronted with real-world implementation challenges.


Three pillars of this project

Performance and Responsiveness

  • More responsive navigation (fast drawing)
  • More responsive to user actions (low latency editing, updates, selection)
  • Handle more complex objects / scenes with ease
  • Work on a wide range of hardware
  • Be power efficient (update screen only when needed)

Task appropriate shaders

  • Higher quality visuals (real-time PBR, rich material preview, solid outlines, shadows)
  • Different shading modes for different objects
  • Improve usability (visual cues, showing selection, task-centric drawing techniques)

Integration with external engines

The new viewport will allow for more accurate preview of the external engine your project is targeting. Each engine may have different material and shading models, as well as unique screen effects. Whether you’re making assets for a movie or for a game, the idea is to see what it’s going to look like while you work on it.

  • Offline renderers: Cycles (i), Luxrender (ii), Renderman (ii)
  • Real-time renderers: Blender Internal (iii), p3d (ii), Sketchfab (ii)
  • Game engines: Unreal (ii), Unity (ii), CryEngine (ii), Blend4Web (ii), Armory (ii), Godot (ii)

(i) For Cycles support we will use the PBR implementation docs by Clément Foucault as a reference [link].

(ii) Proper export of assets (with materials) to your engine of choice is not strictly part of the viewport project, and neither is the specific support of its shaders. The project will support Python addons so other developers can work on those integrations.

(iii) We intend to retire Blender Internal as a standalone offline renderer, and make the real-time viewport take over its duties. We plan to support its material system and convert existing Blender files relying on the old engine to their OpenGL equivalent shaders seamlessly whenever possible.

Tasks

A task is an individual step that is part of a larger workflow. For example, the texture painting workflow can be split into a few atomic tasks, such as painting a texture channel, checking the result combined with the other maps, and previewing a final render.

Each task also has a mode associated with it (e.g., sculpting, painting, editing, …). However, the same mode (e.g., mesh edit) may be required by different workflows and tasks (e.g., retopology and asset modelling).

Even though we want to support customization, we will build a pipeline based on artists’ feedback, with well-defined presets for each task. The list of tasks to be supported can be fleshed out during the upcoming usability summit.

That being said, a few workflows will be selected for the initial deliverables, and are listed here as the initial minimum viable products.

Switching between tasks should be as simple and fast as picking one from a pie menu.

Plates

Different tasks require different visuals. For example, for layout you want to see the scene lighting, depth of field, shadows and reflections in real-time. For asset modeling you may need a basic studio light setup, polygon edges, special edges such as sharp or UV mapping seams.

Every one of those individual components (depth of field, studio light, front wires, …) is called a plate. They are composed together in a stack or processed in parallel depending on each plate. If you want the edit object to show in a fully lit (PBR) shader, that’s a plate. If on top of this we have a highlight outline for the edit object, that’s another plate. If we have all the other objects in a simplified representation (no subsurf) with a clay shade, this is another plate. If we have depth of field on the viewport, that’s yet another plate.

A key point is that plates are drawn independently of each other. Manipulator widgets or selection outlines don’t need to know whether they’re being drawn atop a simple solid shaded scene or a full Cycles rendered scene. Separating all our different drawing techniques into independent plates is going to allow mixing & matching & combining of techniques to assist the task at hand. On the developer side, this gives us markedly cleaner and easier-to-maintain drawing code. New visuals can be implemented with little worry about interfering with how other plates are drawn.
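To make this concrete, here is a minimal sketch of the plate idea in Python. None of these names exist in Blender; it only illustrates how independent plates could be stacked per task:

```python
# Minimal sketch of the plate concept -- all names are hypothetical.

class Plate:
    """One independent drawing technique (clay shading, outlines, DoF, ...)."""
    def __init__(self, name):
        self.name = name

    def draw(self, scene_data, buffers):
        # A plate reads scene data (and possibly earlier plates' buffers)
        # and writes only into its own output buffer.
        return f"<{self.name} buffer>"

def compose(plates, scene_data):
    """Draw the plates in order; no plate knows about the others."""
    buffers = {}
    for plate in plates:
        buffers[plate.name] = plate.draw(scene_data, buffers)
    return buffers

# A possible "asset modeling" task preset, expressed as a plate stack.
modeling_task = [
    Plate("clay_shading"),       # edit object in a simple clay shader
    Plate("edit_wires"),         # polygon edges, seams, sharp edges
    Plate("selection_outline"),  # highlight outline on top
    Plate("depth_of_field"),     # optional screen effect
]

print(compose(modeling_task, scene_data=None))
```

Because each plate only touches its own buffer, swapping the clay shader for a full PBR shader would not require changing the outline or DoF plates.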

An extensive list of plates can be found in Appendix I.

Minimum viable products

The viewport project is structured to gather continuous feedback from artists during its development. In order to balance work on the code infrastructure with showing useful results for artists to jump in right away, we defined a milestone per project pillar.

Each milestone represents the initial code to consolidate a pillar’s full implementation, as well as a fully functional artistic working experience. Those specific milestones will be treated as MVPs—minimum viable products—and should illustrate the benefits of the viewport project for a certain workflow.

The MVPs were chosen based on their gradual implementation complexity, their clear correlation with the proposed pillars, as well as their relevance in an animation studio pipeline.

  1. Character modeling
  2. Camera and staging
  3. Texture painting

1. Character modeling

“Performance and responsiveness”

Deliverables:

  • Huge performance gain
  • Beautiful editing mode
  • Groundwork to illustrate how wireframes are a poor choice for some (if not all) tasks

Plates:


  • Optional plates like Grease Pencil are shown here with a dotted outline.
  • Tool UI (snap, selected v/e/f, …), proportional editing
  • Object being edited will be drawn in a way that stands out visually and informs the modeling task. Shown here in basic clay for early deliverables.
  • Objects not being edited are shown for spatial context but drawn in a way that diminishes focus on them.
  • Optional reference image (blueprint, face profile, etc.)

2. Camera and staging

“Task appropriate shaders”

Deliverables:

  • Reliable and controllable lights
  • Advanced screen effects (DoF)
  • Flat reflection
  • High quality playblast previews

Plates:


  • Light elements (widgets to adjust scene lights)
  • Probe elements (to set up cubemap reflections)
  • In this example objects are drawn solid (per-material flat color) with scene lights.

3. Texture painting

“Integration with external engines”

Deliverables:

  • Fantastic, good-looking shaders
  • Smooth pipeline (no need for switching between Cycles/BI to get the right “look” or options in the viewport)

Plates:


  • Tool UI includes painting cursor, brush preview, etc. If preview lighting is used, an extra plate with grabbable light controls will also be shown.
  • Edit Object plate will be one of the variations shown. You’ll be able to flip between them as needed.

Pending designs

We plan to revisit the overall design every time we reach one of the MVPs. For instance, there is no mention yet of how task customization will work, nor of the user interfaces involved.

We also need to explore what the other workflows may be, what tasks they consist of, and what technology and techniques we need in the viewport to fully support them.

The idea is to validate this proposal with the deliverables, and adapt it accordingly.

Meet the team

The development will be open to the usual Blender contributors, artists and developers. We will have an open call for developers to assist with specific initiatives here and there, and will welcome anyone willing to work together towards the 2.8 viewport goals.

The current team, their IRC nicknames and their roles in the process are:

Mike Erwin (merwin) is responsible for the low level implementation. This has started already with the OpenGL upgrade, Vulkan preparations, and internal API design.

Dalai Felinto (dfelinto) will help coordinate the work from the developers and artists, and take over part of the core development.

Clément Foucault (hypersomniac) will take over much of the PBR implementations. Part of his PBR branch will be ported over as a Cycles alternative for realtime previews.

Sergey Sharybin (hackerman) will handle all the scene data that will be fed into the pipeline. This will happen in parallel to the new Blender depsgraph.

Blender Studio artists will provide feedback and help validate the design.

Appendix I – Plates

Following are some of the individual plates we anticipate being required. Additional plates can be implemented once the workflows and tasks are better defined in the future.

Effects

  • Depth of field
  • Reflection
  • Color adjustment
  • Fluid simulation
  • Particles
  • Lens flare
  • Custom (GLSL)

Elements

(aka non-renderable objects)

  • Cameras
  • Lamps
  • Probes
  • Speakers
  • Forces
  • Motion path

Shading

  • Clay (solid + AO)
  • Outline highlight
  • Solid wireframe overlays
  • Depth wires
  • Solid color with scene lights
  • Grayscale with scene lights
  • Real-time (GLSL/PBR) with scene lights
  • Real-time (GLSL/PBR) with HDRI
  • Matcaps
  • Raytracer lit (e.g., Cycles)
  • Studio light
  • Silhouette (uniform color / solid black)

Filter

  • All objects
  • Edit object
  • Object types (Mesh, Armatures, Curves, Surfaces, Text …)
  • Mouse over object (?)
  • Mouse over mesh hot spot object (? – mesh manipulator)
  • Outline search result (?)

Misc

  • Simplified geometry
  • No hair

Appendix II – Buffers

Each plate reads specific data from the scene graph and from other plates’ buffers, and writes into its own buffers. The buffers for a typical 3D View will consist of:

  • Scene depth (32-bit)
  • UI depth for 3D widgets
  • HDR color (16f RGBA by default, 32f option in system prefs)
  • LDR color (8-bit RGBA)
  • Object ID for selecting/masking (32-bit integer)
  • Composited screen color (10-bit RGB)
  • Previous HDR color for SS reflections

These details are mostly invisible to the user, and are part of what the viewport API should be able to gather from the Blender “datagraph”. To quell panic about runaway VRAM usage: each plate does not necessarily have its own private buffer, these buffers cover only the 3D view (not the whole screen), and only the composited screen color must be double buffered.
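As a rough illustration of that buffer set, here is a sketch in Python; the format names follow common OpenGL conventions, and everything else is hypothetical:

```python
# Hypothetical per-3D-view buffer set, matching the list above.
from dataclasses import dataclass

@dataclass
class ViewBuffers:
    scene_depth: str = "DEPTH_COMPONENT32"  # 32-bit scene depth
    ui_depth: str = "DEPTH_COMPONENT32"     # separate depth for 3D widgets
    hdr_color: str = "RGBA16F"              # 32f optional via system prefs
    ldr_color: str = "RGBA8"
    object_id: str = "R32I"                 # selection/masking IDs
    screen_color: str = "RGB10_A2"          # composited, double buffered
    prev_hdr_color: str = "RGBA16F"         # previous frame, for SS reflections

print(ViewBuffers())
```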


Asset management and Pipeline


Here is a quick report/summary of the work and discussions about data-block handling, asset management and pipeline that happened at the Blender Institute during my four-day stay in Amsterdam in September 2016.

Reconfirmed Principles

Maybe the most important point is the clear separation between the low-level tools and APIs that allow data-block and asset management, and the high-level, complex setup that is a production pipeline. Blender itself should not enforce any type of pipeline or asset management; it must only provide tools to make them possible. The “Asset Engine” idea springs from this principle, quite similar to the “render engine” system used to integrate external renderers into Blender.

The second point is that we want to keep the linking system already existing in Blender as the basis for the new features. This ensures both a degree of backward and forward compatibility, and excellent performance. However, it implies that we keep the .blend file format at the core of the system.

Current Status

Claude Cloud Asset Engine in action.

The past year has been mostly spent on fixing and enhancing how Blender handles its data-blocks. Most of this work has been merged in master now, and is part of the 2.78 release.

The asset engine system/API is also mostly finalized, and has two testing/demo engines now:

  • Amber, designed as a simple local asset repository (asset meta-data being stored in a JSON file along with library .blend files).
  • Claude, mimicking the Cloud Add-on, to allow browsing and importing textures from the Blender Cloud on-line repository.

Roadmap and TODO’s

The main goal in the coming months is to finalize the Asset Engine API and, as a smaller side-track goal, to enhance our ‘pose library’ feature.

The Asset Engine API is now for the most part well defined, and (limited) tests have proved it to be working so far. What remains to be done here is mostly implementing the last bits of it, plus a lot of UI/UX design and code work (currently it exposes little more than basic linking to users; access to things like variants and revisions is not yet possible, for example).
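To give a flavor of what an asset engine add-on could look like, here is an illustrative sketch; the class and method names below are invented for this post and are not the actual branch API:

```python
# Illustrative only -- not the real asset-engine branch API.

class AssetEngine:
    """What an asset engine add-on might subclass,
    similar in spirit to Blender's RenderEngine."""
    bl_idname = ""
    bl_label = ""

    def list_assets(self, query):
        """Return asset entries (with variants/revisions) for the browser."""
        raise NotImplementedError

    def load_asset(self, uuid, revision=None):
        """Resolve an asset to a .blend library path for linking."""
        raise NotImplementedError

class AmberEngine(AssetEngine):
    """Amber-style engine: meta-data in a JSON file next to the .blends."""
    bl_idname = "ASSET_ENGINE_amber"
    bl_label = "Amber"

    def list_assets(self, query):
        import json, pathlib
        repo = json.loads(pathlib.Path("amber_repo.json").read_text())
        return [a for a in repo["assets"] if query in a["name"]]
```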

The Sub-Data Issue

One big topic that is still somewhat open is how to handle non-data-block data (a good example: texture image files). So far, the asset-engine branch handles them by:

  • Defining a ‘virtual library’ data-block (one per asset engine repository).
  • When loading an image file, it creates a ‘faked’ linked Image data-block, using the relevant virtual library.

This is how things should work in a more generic way, but it poses a problem: linked data-blocks cannot be edited at all, which, in the case of image files, means you cannot edit any setting of the Image data-block and have to go with the default values.

The current asset-engine branch works around this with a hack – it detects those faked-linked data-blocks and allows them to be edited. Furthermore, it also fully saves them in the .blend file (instead of only saving a ‘reference’, as is usually done for linked data).

However, this is not an acceptable solution for final code; here we’ll need the ‘override’ feature, which would allow us to define a purely linked Image data-block (still using the virtual library), and then add a local overriding Image data-block that would allow editing of everything but the image file path.

Pose Library Enhancement

As a smaller, closer-term project, we want to enhance the Pose Library system in Blender, with both some general UI/UX improvements and a preview system for each pose.

This can also be used as an occasion to investigate and experiment with things that should be used more widely in the final Asset Management feature (that is, easy and friendly access to data, with icons, drag’n’drop, etc.).

A more detailed plan for this project can be found on Sybren’s page.

The Collada Case


This is a story about Collada, its current status and what we can possibly do in the future.

Where we are

The current implementation of the Blender Collada module is based on the OpenCollada C++ library. OpenCollada’s goal is to fully implement the Collada standard. The library has been developed slowly over the years, with a significant drop in activity during 2011–2014. Since then it has slowly gained speed again; see the OpenCollada Contributor Chart.

The initial reasons why the OpenCollada library was chosen for Blender are somewhat murky, but the three main ones seem to be:

  1. It was available when Blender 2.5 was born
  2. It provided an Import part and an Export part “out of the box”
  3. There was someone who got it to work with Blender in a GSOC project

How it is implemented

Blender normally uses Python-based add-ons for implementing data importers and exporters. But OpenCollada is a C++ library and does not come with a Python API. Because of that, the Collada module is also written in C++ and is fully integrated into the main Blender program itself.

So what we call the Collada importer/exporter is a set of C++ classes sitting between the Blender program and the OpenCollada library, mediating data between the two.

What’s wrong with it?

There are a couple of downsides to the Collada module as we have it today. The first is that on the Blender side many very basic functions are used, which makes it hard to follow what is actually happening and why things have been done the way they are. There is about zero documentation on why parts of the code are implemented as they are (in short: odd stuff).


On the other side, the OpenCollada library seems to be used in some weird ways. I am still not sure why the Blender interface is doing so much on its own, why imported files are always read twice, and why so many things are done in the module where I suspect the OpenCollada library has already solved them for us.

So here is the list:

  • The module must be maintained by C++ people
  • Maintenance is awkward due to self-made code complexity (too much C++ inside)
  • The library adds about 35 MB to the Blender distribution
  • The importer and the exporter have never been fully developed

What we can do about it

I see three basic options for how we could improve:

Full Rewrite

After fiddling with this module for about three years, I finally feel confident enough to try a full rewrite. The idea is to reorganize the module such that it becomes easier to maintain and easier to improve, and at the same time make it easier to replace the current OpenCollada library with something new (also based on C and C++), for example for a …

Hybrid Approach

We make use of the rather complete feature set of the OpenCollada library (or a more recent replacement, if one exists) on one side, but also add a Python API to it, so that we get the advantages of a Python add-on which can possibly be maintained by more people.

Full Replace

We can remove the module and create a full replacement in Python. In fact, for the exporter side there are already a couple of alternatives available. However, as far as I can tell, none of the Python-based alternatives seems capable of fully replacing what we have by now.

For the importer side, all I can tell so far is: this is where most of the trouble seems to be located. Collada is complex and very flexible. Trying to create a full-featured importer based on Python might be a huge task.

What we will do about it

In the very first place we will attempt to simplify what we have and reorganize the library with the hybrid approach in mind, but without actually implementing any Python API. I think the best we can do here is to add two organizational layers between the importer/exporter and their counterparts (Blender and the library):

  • The IO Shield is the only pathway that we will allow for getting stuff out of Blender or into Blender. This can be a very thin API (if needed at all) based on the Blender RNA module.
  • The Compatibility Cloak is the only pathway that we will allow for getting stuff from Collada or exporting stuff to Collada.


The cool thing about this approach is that we can put all Blender dependencies into the IO Shield, while the library-dependent stuff goes into the Compatibility Cloak. When we later decide to step out of the C world, we can convert the IO Shield into a Python API and then go crazy with our Python add-on.

Actually, the IO Shield might even serve as a generic API for importers and exporters, as the general questions about how to feed data to Blender or pull data from Blender should remain more or less the same for all formats… no?

What we could need from you

  • First of all, we ask for your opinion about this approach. Is it feasible, can it be improved, are there alternative approaches worth looking at?
  • But we also want to know in which ways Collada is actually used. All I know so far is that those who use it make heavy use of it.
  • I also know there are other groups out there in the wilderness of the internet who cannot live with our current implementation for one reason or another. It would be helpful to get detailed information on the ways in which the current implementation does not work (including demos of how it fails).

Why am I doing this?

I think that Collada is an open standard and Blender should support this standard in the best way it can. I also believe that we can get the most out of this by embracing existing work like OpenCollada, or possibly other existing libraries (C or Python), and helping to improve what exists, instead of getting into “this stuff is odd, I can do better than that, take mine, it’s awesome” attitudes.

And besides all of the above, I just want to learn how it’s done. If anybody feels like they can actually take the lead here, please step in. I would be more than happy with that. Until that happens I will do my best to keep this going and make it a successful learning experience 🙂

I will add my activities to this article as time goes by. But now it’s time to open the editor and start doing.

Layers and Beyond


Disclaimer

This document was built on top of the development and research led by Julian Eisel (Severin) as part of the Google Summer of Code 2016. This proposal is a usability amendment to his design with the viewport project in sight. See the original documentation [here].


Introduction

The current layer system in Blender is showing its age. It has proved effective for simple scene organization, but it lacks flexibility when organizing shots for more convoluted productions.

The layer system for Blender 2.8 will integrate workflow and drawing requirements. It will allow for organizing the scene for a specific task, and for controlling how different sets of objects are displayed.

The following proposal and design document tries to address the above by unifying the object and render layers and integrating them with the viewports and edit modes.

For the rest of the document, whenever we refer to a layer we are talking about the new layer system that replaces all previous layer implementations in Blender. Since in the future we will have other kinds of layers (e.g., animation), the layer referred to here will be renamed soon (view layer? render layer? workflow layer?).

Data Design

Here is a proposal for how to separate data and integrate the layer concept with the scene data. In 2.8 we will split the scene into layers, and each layer will have a mode and an engine to use for drawing. Each viewport can show a different layer.

Data Separation

DNA data (what gets saved and what we work on), scene data (based on animation or scripts), and render data (engines) should have an explicit separation. In 2.8 we will split viewport rendering into many smaller ‘engines’.


(Data diagram revised from original data design by Sergey Sharybin and Ton Roosendaal)

Scene and Layer

Each layer will have its own active object and selected objects. We drop the scene base, and guarantee there is always a layer in the scene (as we do for render layers now). A scene also has an active layer, which is the one used to determine the “global” mode for the other editors.


Render or Draw Pipeline

  1. Active Layer
  2. Collections define visibility, overrides, light!
  3. Update layer list (or make copy)
  4. Send to depsgraph (or use cached)
  5. Send scene data to engine
  6. Draw per-mode specific tools
  7. Draw or render

1 render layer = 1 image (multi-view, passes)

Layers

A layer is a set of collections of objects (and their drawing options) required for specific tasks. You can have multiple layers in a scene for compositing effects, or simply to control the visibility and drawing settings of objects in different ways (aiming at different tasks).

A layer also has its own set of active and selected objects, as well as render settings. A few render settings are kept at the scene level, such as dimensions, views and file format.

A scene can have multiple layers, but only one active at a time. The active layer will dictate which data the editors will show by default, as well as the active mode.

Collections

Collections are part of a layer and can be used for organizing files, fine-tuned control of component drawing, and eventually override support for linked and local data. They are groups of objects defined individually or via name and other filtering options.

The drawing settings relative to a collection will be defined at the layer level, with a few per-collection engine-specific options (matcap, silhouette, wire).

A collection has the following data: name, visible, selectable, engine settings per collection, overrides, element list.
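A minimal sketch of this data layout, with illustrative field names only (the final DNA will differ):

```python
# Illustrative scene > layer > collection layout, not the final DNA.
from dataclasses import dataclass, field

@dataclass
class Collection:
    name: str
    visible: bool = True
    selectable: bool = True
    engine_settings: dict = field(default_factory=dict)  # matcap, wire, ...
    overrides: dict = field(default_factory=dict)
    elements: list = field(default_factory=list)  # objects, or name filters

@dataclass
class Layer:
    name: str
    engine: str = "CLAY"                 # engine used to draw this layer
    mode: str = "OBJECT"                 # mode is a property of the layer
    active_object: object = None
    selected_objects: list = field(default_factory=list)
    collections: list = field(default_factory=list)

@dataclass
class Scene:
    layers: list = field(default_factory=list)
    active_layer: int = 0                # dictates the "global" mode

scene = Scene(layers=[Layer("modeling", collections=[Collection("props")])])
```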

Mode

As part of the workflow design a layer will be created with a task in mind, and part of this task is the mode used at a given time. The mode will be an explicit property of the layer.

The available modes at any given time depend on the active object. All selected objects (when possible) will enter the same edit mode. That means an animator will be able to pose multiple characters at once. The mode is no longer stored in the object.

Viewports

A viewport should be able to select a single layer to display at a time. However, the viewport stores the least amount of data possible: apart from the bare minimum (e.g., the current view matrix), all the data, including the current mode and shading, is stored in the layer.

That also means we can have multiple viewports in the same screen, each one showing different objects, or even the same objects but with different drawing settings.

Since the settings are defined at the scene level (scene > layer > collection), and not per viewport, multiple viewports can share the same drawing settings and object visibility. The same is valid for offscreen drawing, scene sequence strips, VR rendering, …

Viewport and Render API

This still needs to be discussed further, but the initial idea is to re-define the Render API so that an external engine would define two rendering functions:

  • RenderFramebuffer
  • RenderToView

The latter would be focused on speed, and the former on quality. That creates a separation from the original idea of making “Cycles-PBR” or “Cycles-Armory” shaders part of the Cycles implementation.
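A sketch of how these two entry points could look to an engine author (hypothetical names; the real API is exactly what still needs to be designed):

```python
# Hypothetical two-entry-point Render API, per the idea above.

class RenderEngine:
    def render_framebuffer(self, depsgraph, result):
        """Quality path: final render of a layer into a framebuffer."""
        raise NotImplementedError

    def render_to_view(self, depsgraph, view_matrix):
        """Speed path: interactive drawing directly into the viewport."""
        raise NotImplementedError

class MyEngine(RenderEngine):
    def render_framebuffer(self, depsgraph, result):
        pass  # slow and accurate: full sampling, all effects

    def render_to_view(self, depsgraph, view_matrix):
        pass  # fast: simplified shading, cached GPU data, progressive updates
```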

This separation also means we can (and will) have a new render engine dedicated to nice, good-looking PBR OpenGL viewport rendering. The pending points are:

  • How do we handle the external engine drawing
  • Shader design (face, lines, points, transparency)

Render Engine View Drawing

There are two main options to implement this:

  • Blender draws the objects using the GLSL nodes defined by the render engine, plus some screen effects, handling probes, AO, …
  • The render engine is responsible for going through the visible objects, binding their shaders, drawing them, …

Either way, the engine will be responsible for drawing the solid plates [see docs]. This is then combined with tool drawing, outlines, …

Engines such as Cycles really need to be in control of the entire drawing of their objects. However, for OpenGL-based engines, we could facilitate their implementations.

OpenGL Rendering Engine

A new render engine should be able to close the gap between fast viewport drawing and realism. This engine could take care of PBR materials that deliver Cycles-like quality, without having to mess with Cycles-specific shader groups.

In fact, this engine should have an option for the backend used for the views. The user should be able to pick between Cycles, PBR, Unreal, Unity, … and have the texture maps correspond to the specific engine.

Cycles itself can still deliver a RenderToView routine that is faster than what it is now (with forced simplification, …). That said, it’s not up to Cycles to implement GLSL nodes that are similar to its BSDF implementations.

Mode Engines

We will have specific engines aimed at specific edit modes. Those engines can be used for drawing a layer exclusively, or combined with the layer render engine. For example, when using the Cycles render engine in object mode, we will have the object engine drawing nice highlights on the selected objects, as well as object centers and relationship lines.

However, the object mode engine will also be accessible as a standalone engine option, bringing back full solid drawing for the entire layer, with its own options. This also means we are dropping the current draw modes, and in particular the per-object maximum draw type.


Discussion Topics

Topics for the usability workshop in November 2016.

How do we edit multiple objects?

Should we support mesh edit? Meta-balls? Maybe support for some modes only.

Grease Pencil

Since Grease Pencil is undergoing some changes, we should make sure it integrates well with this proposal.

Linking

Artists will want to re-use a carefully crafted layer/collection structure. Linking wouldn’t solve that, because usually they wouldn’t want the objects to be linked in as well. We could have a save/load system for layers/collections. Maybe it’s something simply doable with Python, similar to how we save/load keymaps.

Scene Override

An animator’s typical workflow is to have one viewport for the camera view and another one for posing. Changing a setting such as simplify would now mean changing it in different layers (or collections), or having two other layers with different “simplify” settings and switching between them.

New Objects

New objects in Blender are automatically added to the active layer and active collection. But should we have a way to also add them to non-active layers and their respective collections?

Acknowledgements

Original layer management design by Julian Eisel and Ton Roosendaal, with extensive review by the developers and artists on [T38384].

Current design elaborated with Ton Roosendaal and feedback from Sergey Sharybin. Feedback incorporated from Blender Studio crew.

Blender 2.8 project status


The main topic of my Blender Conference 2016 keynote was the Blender 2.8 project: where we are and what to expect from it in the coming year. The biggest news is that we are really going to start working on it, with more developers than ever – especially thanks to the support we get from the industry.

The main sponsors currently are (in parentheses, the number of full-timers):

  • Blender Foundation (2+)
    Thanks to donations and the Development Fund we can keep supporting developers to handle daily tasks – especially bug tracker reports and patch reviews. For the coming period we can keep supporting two (near) full-time positions on this job. The aim is to involve the active volunteers first; discussions with them are ongoing or will start soon.
  • Blender Institute, Netherlands (3+)
    Thanks to Blender Cloud, subsidies and some sponsoring, the Blender Institute can already hire a team of 12 people, which includes myself (general coordination), Francesco Siddi (pipeline and web development), Sergey Sharybin (dependency graph, Cycles), Sybren Stuvel (web tech, pipeline) and Pablo Vazquez (web design, UI team).
  • Nimble Collective, USA (1)
    This Mountain View startup is investing a lot in Blender, via its own channels and by supporting a half-year developer seat at the Blender Institute.
  • Tangent Animation, Canada (2+)
    This Toronto and Winnipeg based animation studio recently completed their first 100% Blender-made feature film – Run Ozzy Run. They are currently working on their second feature, a sci-fi story with robots. Tangent hired its own developers to work on Blender and Cycles, and supports two full-time developers at the Blender Institute for one year to work on the Blender viewport.
  • AMD, USA (2+)
    AMD already hired Mike Erwin to work on upgrading Blender’s OpenGL, recently hired a small team to add ProRender (an open source, OpenCL-optimized engine) to Blender, and now supports one Blender Institute developer working on Cycles for OpenCL GPUs for one year.
  • Aleph Objects, USA (2+)
    The makers of the popular LulzBot 3D printer accepted a proposal from the Blender Institute to fund the “Blender 101” project – a release-compatible version of Blender configured for kids (education) or for occasional users. This is closely related to the goal of the Blender 2.8 “Workflow” project anyway – we want users to more efficiently make (or use, or share) optimally configured Blenders for their own workflows. This will support at least two full-time developers on “101” and “Workflow” for a year.

The result is that we can roughly assign 10 full-time developers to Blender 2.8 in the coming period!

What this means is the following:

  • At least three full-timers will work on the viewport, including PBR, new shader editing, layers, compositing, etc. Which means we totally get something in the course of 2017!
  • Workflow and usability will get a good amount of attention. We still have to be realistic with the goals, but we can for sure make another big step forward to finish the 2.5 goals and make sure we’re ready for future development.
  • Work on Dependency graph (animation system, duplication, overrides) and asset management will move on as well, including Alembic caching. We might start a flirt with USD?
  • And of course we keep our flagship render engine Cycles up to date for all platforms!
  • Blender 2.8x (and branch) will be tested in highly demanding production environments.

What stays open for now:

  • The “everything nodes” concept will have to wait a while. It’s not something for one person (or a part-timer) to do in a short time, especially not now that we’re doing so many other projects already. This project also depends on the successful completion of the “depsgraph” work, which will take many months – if not half a year.
  • Better news: the Blender Game Engine project might be getting a revival. The UPBGE team is very motivated to keep the Blender game engine working, and to help align the BGE with the goals we set for 2.8 – at least to use the new viewport and PBR shader system. A new logic system is still undefined.
  • Workflow issues for non-open pipelines: we will try to cover some of the topics (UDIM support), but most pipeline-specific issues around integrating Blender well in production pipelines with other (closed) CG programs are still something we’d need active help for – preferably code contributions by the companies who need such features themselves.

Next weekend (25-26-27 November) there’s a 12-person “2.8 Workflow” workshop at the Blender Institute with active coders and contributors to Blender. Participants are Jonathan Williamson, Pablo Vazquez, Julian Eisel, Paweł Łyczkowski, Daniel Lara Martinez, Sebastian Koenig, Bastien Montagne, Brecht van Lommel, Mike Pan, Sergey Sharybin, Dalai Felinto and myself. I expect we can then align a lot of ideas and requirements and agree on design decisions that will help everyone move on for several years! Expect elaborate reports from this team here and on the usual bf- mailing lists!

Highlights of Workflow Workshop


The Blender Institute had three days of coder/artist workshops on 2.8 designs and targets. Having 12 brains going over all aspects of usability and technical concepts has been extremely useful. The team is currently gathering all notes and designs and will make extensive reports for everyone here soon. Some docs will appear on the code.blender.org site; most of them go to wiki.blender.org.

I can happily confirm that we agreed on everything we’ll present. It’s well aligned with what we started for 2.5 – very good to have clear focus and direction! Of course we look forward to a public review and a high level of support from everyone who’s involved with Blender.

While the real detailed docs are getting reviewed and finalized, I’d like to share the highlights with you in this short summary:

Blender 2.5 concepts

  • Design principles of the original 2.5 doc were verified and confirmed to still be valid.
  • Suggested is to make more use of temporary popups for settings (viewport settings, etc.). That clears up the too-long property panel lists.
  • The select-tool order can be kept working by use of a new toolbar that will present tools using the new widgets/manipulators. The goal is that most operators will get the ‘redo’ operator as widgets in an editor, allowing users to tweak operations more efficiently, but also to repeat them on a different selection. For example translate and rotate, but also spin, duplicate, extrude, knife, etc. can work this way.

Layers

  • The Scene Object list will become a hierarchical list of Object Collections. Each Collection can hold other Collections and all the Objects you want. Objects can be in multiple Collections.
  • The new Layers then use (= point to) the Scene Collections to define what’s visible or editable, or to set overrides (draw types, materials, etc). This way you can set up Layers that use the same Collections, but with different visibility and render settings.
  • Those Layers will be used by all editors (including viewports) and render engines and the compositor.
  • Visualization of Object-mode and Edit-modes in Blender will be coded as a special drawing engine, composited on top of viewport drawing/rendering itself.

Workflow

  • New configuration files will allow defining “Workspaces”, which will be available as tabs in the top header of a window. Each Workspace can store multiple layouts (old name: ‘Screen’), and will allow filtering or changing visible panels, tools, menus, active add-ons and shortcuts.
  • Blender’s screens (layouts) will get a default top bar with menus, the active scene, the active mode and the active layer. Scene, Mode and active Layer will be ‘global’ for all editors in that workspace.
  • Further workflow configuration will be enabled by efficient use of template files (sets of default assets or primitives) and a project file (settings that need to be set per project, such as file paths or web settings).
  • Special attention will go to delivering a couple of low-level input “keymaps” for input devices like mice, pens, tablets and touchpads. These should be selected based on your hardware setup and preferences (left select, right select). On top of that we make one strong (more minimal) default shortcut keymap for all basic operations.

Technical design

  • Each Layer gets its own dependency graph
  • Overrides can be static (on start/load) or dynamic (animated).
  • Overrides come in two types: replacing values (color, etc.), or adding/removing items in a list (modifiers, constraints, etc.).
  • Particle and hair systems will be put back, but as their own primitive type. A full proposal is coming soon.

Participants were a very diverse group of coders and artist/contributors:

Pablo Vazquez, Julian Eisel, Paweł Łyczkowski, Daniel Lara, Sebastian Koenig, Bastien Montagne, Brecht van Lommel, Mike Pan, Sergey Sharybin, Dalai Felinto, Ton Roosendaal and Jonathan Williamson remotely.

Ton Roosendaal

1 December 2016


The Blender 101 Project and You!


What and Why?

With the upcoming 2.8, Blender is going to be an amazing tool for artists. However, not everyone is a professional working in a studio environment. Many are casual users. Some use Blender just for 3D printing, some to make models for games, and some just want to teach their young kids the basics of 3D. For these users, Blender’s all-in-one approach poses a huge challenge.

The Blender 101 project is about making Blender usable for everyone. Using features that are going to be available in 2.8, such as Workspaces and Templates, and some good ol’ coding, we aim to achieve the goal of ‘Blender for every occasion’.

Just like how the new Workspaces in Blender 2.8 will optimize the interface for specific tasks (such as modelling, animation, sculpting and compositing), Templates will go one step further, transforming Blender into very focused applications that do specific things really well. Possible candidates for templates include:

  • Blender Simple
  • 3D Printing
  • Games Creation
  • CAD

The template will be something that a user can select from the splash screen of Blender.


Selecting a template can change every aspect of Blender, including:

  • User preferences
  • Addons
  • Color schemes
  • Input maps
  • Units
  • Interface layouts
  • Manipulator widgets
  • Pre-built 3D assets and presets

Because Templates could potentially be big in size (since they may contain pre-made assets), they can be made available as separate downloads and loaded into Blender like an add-on. Some templates can be bundled in the default install. This is to be decided later.
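For illustration, a template could boil down to a bundle with a small manifest describing what it changes; everything below (field names and the add-on id) is hypothetical:

```python
# Hypothetical manifest for a downloadable "3D Printing" template bundle.
template_3d_printing = {
    "name": "3D Printing",
    "preferences": {"unit_system": "METRIC", "scale_length": 0.001},  # mm
    "addons": ["print3d_toolbox"],          # invented add-on id
    "theme": "print_light",
    "keymap": "minimal",
    "layouts": ["Model", "Check", "Export"],
    "assets": ["calibration_cube.blend"],   # bundled pre-made assets
}
```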

Blender Simple

An ideal candidate for a template is a massively simplified Blender. This template will strip down the interface to the bare minimum, encouraging inexperienced users to explore a 3D program without worrying about the consequences of making a mistake. The target audience for this is kids under 16 years old, or people who have absolutely no experience with computer graphics. The expectation is that one day they will be able to ‘graduate’ to the full Blender without having to relearn a new interface.

Blender for 3D Printing

Even as 3D printing gains popularity, preparing a model for printing is still a complex and highly technical process. Together with Aleph, we want to make this process as simple as possible. The 3D printing template will have the bounding box and measurement units all set up, ready for you to create or clean up your model. The template will also be able to carry out sanity checks on the model to ensure its printability.

Blender for Games

Blender is an ideal content creation platform for game engines such as Unity and Unreal. However, Blender currently doesn’t have a focused interface for game makers. If we can remove the functionality that is irrelevant to game designers, and provide basic shaders and assets that are compatible with modern game engines, artists will have a much easier time creating content in Blender.

Keep in mind that because these templates still run the same Blender under the skin, the files they create will be interchangeable. For example, one can always take a model created in Blender Simple and retopologize it in the full version.

These are just some ideas for what the 101 project could bring. Our goal is not to ‘dumb down’ the interface and impose artificial restrictions, but rather to optimize the interface for everyone’s individual needs.

After all, as Leo Tolstoy might have once said, “All happy blender users are not alike.”

Please leave your ideas or suggestions as reactions to this article!

Thanks,

Mike Pan

Viewport Project – Plan of Action


December: the time of year for mugs of hot chocolate, bright decorative lights and, of course, development roadmaps. In the spirit of the season, the following document is the plan of action for the viewport project in Blender 2.8. Throughout 2017 we expect to tackle:

  • Final design of engines and passes
  • Clay engine
  • Implement remaining edit modes as passes
  • Workbench engine
  • Eevee engine
  • Port Cycles materials to the Eevee engine
  • Implement missing engines

Final design of engines and passes

Let’s start by defining the terms here used, mainly engines and passes.

A pass is the “black box” code responsible for filling a buffer. For example, the “object mode” pass includes:

  • Relationship lines
  • Object center
  • Outline (selected/active)
  • Widgets (?)
  • Non-geometry objects (lamps, empties, …)

An engine then composes multiple passes together. How the compositing happens is up to the engine. Finally, the engine delivers the final image to the viewport.

For example, a very basic Clay engine takes the “Wire Pass”, the “Solid Pass” and the “Object Mode Pass” and combines them together.
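In code terms, the relationship might look roughly like this; the snippet is purely illustrative:

```python
# Illustrative engine/pass composition for the Clay example above.

def wire_pass(scene):        return "<wire buffer>"
def solid_pass(scene):       return "<solid clay buffer>"
def object_mode_pass(scene): return "<outlines, centers, lamps buffer>"

class ClayEngine:
    passes = [solid_pass, wire_pass, object_mode_pass]

    def draw(self, scene):
        # Each pass is a black box filling its own buffer; how the
        # buffers are combined is entirely up to the engine.
        buffers = [p(scene) for p in self.passes]
        return " over ".join(buffers)  # stand-in for real compositing

print(ClayEngine().draw(scene=None))
```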

At the end of this, the core design will be finalized including a final proposal for a material re-design with corresponding PyNode changes.

Clay engine


Assets courtesy of the Agent 327 – Barbershop scene.

The Clay engine will use matcaps and allow per-object matcaps (so we can override them per collection). It will also support depth of field and ambient occlusion.

This engine is a quick way to start drawing objects in the new viewport, as well as to validate the engine compositing design. The Clay engine will initially only support “Object Mode”. It will be up to each engine to decide which modes it supports.

Implement remaining edit modes as passes

Bring back all the edit modes (edit mesh, sculpt, …). The edit modes will be brought back ONLY once they conform to the new design (layers, depsgraph API, engines, …).

Workbench engine

A complete engine for modeling: a beautiful white canvas with floor shadows for modelers to work on. It should sell the idea that the 2.8 viewport is not only about PBR, but about the right visualization for the task at hand.

The workbench engine will support:

  • Auto-generated colors
  • Beautiful diffuse shading
  • Sharp specular highlights
  • Shadows

The engine will have an option to auto-generate colors either per material or per object. Objects and materials will have override options, which can in turn be overridden by collections.

Eevee engine

To avoid the confusing and overhyped name “PBR”, we propose to call this engine Eevee, the Extra Easy Virtual Environment Engine!

Finally, we will get a nice photo realistic engine. Realtime reflections? Check. Soft shadows? Check. Fast, responsive and eye-candy? Check, check, and check!


Mech Prototype by Clément Foucault – made in Blender, rendered in UE4; a long-term goal for Eevee.

  • Uber shaders for PBR materials
  • Support for the features currently in the PBR branch
  • Backward compatibility (BI conversion – 80% compliance)

Port Cycles materials to the Eevee engine

  • Port Cycles GLSL code to PyNodes
  • Make new realtime shaders for Cycles BSDF nodes

Implement missing engines

Evaluate whether we still need a better replacement for the wire, solid, … draw modes. If we do, design and implement new engines accordingly, or add extra passes to the existing engines.


Dependency graph proposal


This post is all about dependency graph design for the Blender 2.8 project.

Brief overview

Before going too much into technical details, let’s first look at a top-level design overview of what this project is all about, what it all means for users, and how it will affect everyone working in the 2.8 branch.

  • Dependency graph is owned by layer.
  • Dependency graph holds both relations AND evaluation data (aka state of the scene).
  • Dependency graph allows Render Engine to store its specific data in the evaluation context.
  • Dependency graph evaluation data is aware of copy-on-write and data de-duplication.
  • Dependency graph stays flexible to support data compression in the future.

Now let’s look into all the technical details, digging into what benefits each decision brings.

Ownership

Here, and in the rest of the document, “layers” refers to the new layers currently being worked on by Dalai. Those will have almost nothing to do with the layers existing in Blender so far.

Each layer will have its own dependency graph, which will contain both relations and scene state. This gives the following benefits:

  • Each layer gets a dependency graph which is highly optimized for that particular use.
    There are no objects in the dependency graph which do not affect this particular layer, meaning there is no avoidable overhead on depsgraph reconstruction and evaluation. In practice this means that a modelling layer where an artist adds new environment objects will not trigger updates of all the characters’ rig dependencies, keeping interaction as responsive as possible.
  • There is a clear separation of evaluation data (also known as scene state), so it is impossible to have threading conflicts between render and viewport threads.
  • This leads to a really clear ownership model and makes the lifetime of the evaluation data explicit.

Ownership Diagram

In order to keep memory usage as low as possible, the dependency graph and its evaluation context get freed when a layer becomes invisible.

Such an approach does not allow the same layer to be in multiple states in different windows (for example, it’s not possible to have a layer at frame 16 in one window and at frame 32 in another window). But it is possible to set up your workflow in a way that does not require this. For example, you can have an animation layer with only the character you’re currently animating and show it in one window, and have another preview layer which includes everything required for the preview (which could include more objects from the environment or another character).

A possible downside (or at least something to be aware of) in such a model is that showing two different layers with the same objects in different windows will slow down performance (objects would need to be updated for both windows). This can probably be solved in the future with some smart de-duplication of evaluation.

Evaluation Data

Previously, the scene state was fully decoupled from the dependency graph, which had the advantage of simplicity. Now we need to store multiple states of the same scene, and the dependency graph seems to be the most appropriate place for that.

The proposal is to store the actual data in the ID nodes (or other “outer” node types). This will make it simple to map original objects to evaluated ones and vice versa (just to state the obvious: it will cover any datablock which the dependency graph keeps track of, including node trees, collections, …).
Evaluated data for objects includes applied:

  • Drivers
  • Animation
  • Overrides
  • Modifiers

All render engines will only access evaluated objects and will treat them as fully final (no need on the render-engine side to worry about overrides, for example). This will happen via the DEG_OBJECT_ITER() macro (with similar semantics to GHASH_ITER).

Tools will work on original scene data, using objects from active layer.

NOTE: It is still not fully clear whether we’ll need to keep bmain in the depsgraph or not (as some container of all evaluated objects). This is a topic for discussion.

The data flow here would be:

  • Original DNA is fed into the input of the dependency graph.
  • Dependency graph copies that data, and runs all operations on this copied data.
  • Dependency graph stores this modified (or evaluated) data in the corresponding outer node (ID node).

Evaluation Data Flow

In order to make “everything nodes” work, we’ll need to extend this idea deeper and allow operation nodes to have copies of the original data as well. This could be nicely hidden behind API calls, making such specifics fully transparent to all related code, such as modifiers.

NOTE: We would need some API to get the final state of the object for modifiers anyway, so extending the scope where the data is stored does not affect the API that much.

Data stored in operation nodes gets freed once all users of that data are evaluated, which ensures the lowest possible memory footprint. This is possible for as long as we don’t allow other objects to depend on intermediate evaluation results (otherwise we wouldn’t be able to re-evaluate those objects after we dropped all intermediate data).

Interaction with Render Engine

Just to re-cap things which were mentioned above:

  • The render engine only deals with final evaluated objects.
  • The render engine does not worry about overrides, and considers that all data blocks it interacts with have overrides applied to them.

One thing which was not covered here yet is the requirement to store some renderer-specific data in objects. An example of such data could be VBOs for an OpenGL renderer.

The easiest way to deal with this seems to be to allow render engines to store their specific data in the evaluated datablocks. This grants persistence of the data across redraws for as long as objects do not change. Once an object changes and gets re-evaluated, its renderer-specific data gets freed and is recreated by the render engine later on.

If needed, we can go more granular here and allow renderers to have per-evaluation-channel storage, so moving an object around will not invalidate its VBOs.
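As a toy model of that contract (all names hypothetical): the engine caches its derived data on the evaluated datablock, and re-evaluation simply drops the cache:

```python
# Toy model: engine-specific data lives on the evaluated datablock.

class EvaluatedObject:
    def __init__(self, name):
        self.name = name
        self.engine_data = {}  # per-engine cache, e.g. VBOs for OpenGL

    def tag_update(self):
        # Re-evaluation frees renderer-specific data.
        self.engine_data.clear()

def get_vbo(ob):
    # The engine recreates its data only when missing (object changed).
    if "gl_vbo" not in ob.engine_data:
        ob.engine_data["gl_vbo"] = f"<VBO for {ob.name}>"
    return ob.engine_data["gl_vbo"]

ob = EvaluatedObject("Suzanne")
get_vbo(ob)       # built once
ob.tag_update()   # object changed: cache dropped, rebuilt on next draw
```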

Copy-on-write and de-duplication

The goal here is to minimize the memory footprint of the dependency graph storage by duplicating only data which actually changes. Roughly speaking, when the dependency graph creates a local copy of a datablock for evaluation data storage, it only duplicates the datablock itself, but keeps all CustomData referencing the original object’s CustomData. If some modifier changes a CustomData layer, that layer gets decoupled from the original storage and re-allocated. This is actually quite close to how CustomData currently works in the DerivedMesh.
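The copy-on-write behavior can be pictured with a toy model (this is not the real CustomData code):

```python
# Toy model of copy-on-write: the evaluated copy shares layers with
# the original datablock until something writes to one of them.

class Layer:
    def __init__(self, data):
        self.data = data

class EvaluatedMesh:
    def __init__(self, original_layers):
        self.layers = dict(original_layers)  # share storage, no copy yet
        self.owned = set()

    def write_layer(self, name, data):
        if name not in self.owned:
            # First write decouples the layer from the original storage.
            self.layers[name] = Layer(list(data))
            self.owned.add(name)
        else:
            self.layers[name].data = list(data)

orig = {"positions": Layer([0.0, 1.0, 2.0])}
ev = EvaluatedMesh(orig)
assert ev.layers["positions"] is orig["positions"]  # still shared
ev.write_layer("positions", [0.0, 1.5, 2.0])        # modifier writes
assert ev.layers["positions"] is not orig["positions"]  # now decoupled
```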

A more tricky scenario is data de-duplication across multiple dependency graphs, which will help keep memory usage low when multiple layers are visible and in the same state.
The proposal here is to go the following route:

  • Have a global storage of evaluated objects, where dependency graphs store the evaluation results of all outer nodes (objects, node trees, …).
  • Objects in this storage are reference-counted, so we know when we can remove data from the storage.
  • Objects get removed from this storage when they are tagged for update.
  • Dependency graph evaluation checks whether an object at a given state exists in the storage and, if so, references it instead of doing a full object evaluation.

This sounds a bit complicated, but it has the following advantages:

  • Data is still de-duplicated even when two visible layers share some objects with each other.
  • It allows us to know the exact lifetime of evaluated objects.

Data compression

There are multiple techniques nowadays for run-time data compression, some of which are used by render engines. However, this is a bit too much to worry about from the beginning, and it should be well hidden behind the API already. Generally speaking, nobody outside of the dependency graph module should even be aware of something being compressed; it is all up to the dependency graph to deliver the final unpacked datablock when it is requested. For example, DEG_OBJECT_ITER() will unpack objects one by one and provide a fully unpacked, self-contained structure to the caller.

Overrides

Overrides are supposed to be used for simple tweaks to the data, such as “replace material” or “transform this object by a given transform”. They are not supposed to be used for anything more sophisticated, such as topology changes (for that, it’s expected that you simply make the object local).

There are two types of overrides:

  • Static overrides, which are applied by the reading code.
  • Dynamic overrides which are applied by the dependency graph.

Overrides are only applicable on top of existing datablocks. This means you cannot define an override of some duplicated object (for example, you can’t apply an override on a particular instance of a dupli-group).

Static overrides

The idea of static overrides is basically to generalize the existing proxy system, which currently only supports armatures. These overrides basically work like this: the reading code creates a local copy of the linked data. If such a local copy already exists, it synchronizes all the changes made in the library to the local copy.

These overrides are ONLY applicable to linkable data (data which can be linked across .blend files). This means they have nothing to do with things like Collections.

Dynamic overrides

The dependency graph takes care of dynamic overrides. From the implementation point of view, it is an operation node which applies all overrides to a copied version of the datablock before anything else is evaluated for that datablock. Since the dependency graph already takes care of copying the datablock (as mentioned above), the override operation node simply applies all the overrides to the object within the current context (read as: the override operation does not care about copying the object).

Here’s a fragment of dependency graph showing nodes setup:

Example of overrides nodes

The following dynamic overrides are supported:

  • Replace value
    Example: change material color
  • Add to the list
    Example: Add new modifier to the object
  • Remove item from list
    Example: Remove constraints from the object

These overrides also apply to Collections. This is because collections are not linkable across files at all, so static overrides make no sense there.
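Conceptually, the override operation node does something like the following (a toy sketch of the three supported override kinds, not actual depsgraph code), applied to the already-copied datablock before anything else is evaluated for it:

```python
def apply_dynamic_overrides(datablock, overrides):
    """Apply the three supported override kinds to a copied datablock."""
    for op in overrides:
        if op["kind"] == "replace":      # e.g. change a material color
            datablock[op["prop"]] = op["value"]
        elif op["kind"] == "add":        # e.g. add a new modifier
            datablock[op["prop"]].append(op["value"])
        elif op["kind"] == "remove":     # e.g. remove a constraint
            datablock[op["prop"]].remove(op["value"])
    return datablock


# A copied datablock, as the depsgraph would produce before evaluation.
copied = {"color": (0.8, 0.8, 0.8), "modifiers": ["Subsurf"], "constraints": ["Follow"]}
overrides = [
    {"kind": "replace", "prop": "color", "value": (1.0, 0.0, 0.0)},
    {"kind": "add", "prop": "modifiers", "value": "Bevel"},
    {"kind": "remove", "prop": "constraints", "value": "Follow"},
]
print(apply_dynamic_overrides(copied, overrides))
```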

Caches

Now that the lower-level dependency graph topics are somewhat covered, let's look into the bigger picture of caches and streaming data.
The requirement here is to have a pipeline in which artists can bake a whole chunk of their scene and pass it to the next department. For example, when animators finish animating their 10 characters in a scene, they bake all of them and pass the cache to the lighting/rendering department. This way rendering artists can have real-time playback of the whole scene. Being realtime is always a good thing!

Another requirement is to be able to easily investigate and verify how a cache was made, where it came from, and whether it's up to date or not. This could be done automatically and/or manually.
One of the first questions is what gets "baked" into the cache and what still comes from a linked scene/asset. Surely, for maximum performance everything should be baked (including meshes, materials, …), but that makes it really difficult to modify anything after the cache was made.

Currently we consider the following:

  • The final department might consider baking everything (to pass to the render engine, or to other software in the pipeline).
  • Usually artists bake everything apart from materials.

As the implementation of the cache we'll rely on Alembic. It has nice time compression, a flexible object-based inner storage design, and supports (to a certain extent) storage of non-geometry data, which we can use for storing the cache's meta-data.

Just to state the obvious: a cache-based pipeline is not mandatory to use; the old-school linking of .blend files will still stay here (for small projects, or some trivial cases in big projects). The cache-based pipeline is only there to speed up performance of shots in really big scenes. Using caches for small projects is not beneficial anyway, because artists would have the overhead of keeping the cache up to date.

Defining what is to be baked to cache

The proposal is to have a collection property which indicates that this collection generates a cache. Everything in this collection gets baked to the cache (by default, all geometry, excluding materials).
The cache will also contain some "meta-information" about where it came from – which scene, which collection and such. This way we can always go back from the cache to the original file and re-bake it.

A crucial part of dealing with caches is to know whether the cache is up to date or not. To help artists here, the cache will store a modification timestamp of the file it came from, so everyone in the pipeline can immediately see when a cache runs out of date (Blender will show a warning message or a nice warning icon in the header of the cache collection).
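A minimal sketch of such a staleness check (the meta-data field names here are hypothetical): compare the source file's modification time on disk with the timestamp stored in the cache's meta-data.

```python
import os


def cache_is_up_to_date(cache_meta):
    """cache_meta holds the source file path and its mtime at bake time."""
    source = cache_meta["source_file"]     # hypothetical meta-data field
    baked_at = cache_meta["source_mtime"]  # stored when the cache was baked
    try:
        return os.path.getmtime(source) <= baked_at
    except FileNotFoundError:
        # The source file isn't available (e.g. not part of the checkout),
        # so we simply can't verify -- the open question mentioned below.
        return None


meta = {"source_file": "/shots/010/anim.blend", "source_mtime": 1500000000.0}
status = cache_is_up_to_date(meta)  # True, False, or None (unknown)
```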

Still to be decided: in a BAM-pack pipeline, where artists only check out the files and assets required for a particular task, it's possible that the files used to generate a particular cache are not part of that BAM pack. How do we check whether the cache is up to date then?

Using the cache

When one adds a baked cache to the scene, it links in as a collection. For this collection we can visualize which objects are in there (Alembic caches have a notion of objects, so we can deduce the internal structure of the cached scene). Of course, all the properties of those objects are grayed out, because the data is considered linked.

Since the cache knows where it came from, the cached collection will have a button to switch from the cached data to the real data from the corresponding file. This will re-load parts of the scene, and ideally (if the cache is all up to date) artists will see the exact same scene, but will be able to interact with real Blender objects.

Improving Alembic support in 2.79 and 2.8


Blender 2.78 saw basic Alembic support for both import and export. The initial patch was provided by DwarfLabs, and was then integrated into Blender by Kévin Dietrich. For the past month, I (Sybren) have been working on bringing Alembic to the next level, and that work isn't done yet. This means that things are going to change; for the better, of course, but it may also mean requiring a re-export of your Alembic files. In this blog post I want to sketch my plans, and collect feedback from you, our dear Alembic users.

Broad plans

The final goal of my work is to be able to take an animated character, like Hendrik IJzerbroot a.k.a. Agent 327, and export it to Alembic for further processing in either Blender or other software. Furthermore, we want to be able to take animated characters from other software and load them into Blender. This includes support for:

  1. Instanced geometry
  2. Dupli-groups and dupli-objects
  3. Hair and particle systems

Furthermore, we want to change the workflow so that it will be easier to re-export characters by coupling export settings and collections of exported objects, giving the artist more control over what is exported exactly, and automatically (re-)apply certain modifiers and materials upon import.

Part of this work can be done for Blender 2.79 already, while other parts will have to wait until Blender 2.8.

Ideas for Blender 2.79

Blender 2.79 should be a stable release. As such, I'll only do bugfixes and non-invasive improvements here. For example, I've fixed some bugs with coordinate transformations (we use Z=up in Blender, but Y=up in Alembic files). As a result, once those fixes are in the master branch (and thus in Blender 2.79), files will export & import differently. If you exported your files from Blender, you may have to re-export them to benefit from those fixes.
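For the curious, the axis fix boils down to a coordinate swizzle along these lines (a simplified sketch, not the actual exporter code):

```python
def zup_to_yup(co):
    """Convert a Blender (Z-up) coordinate to Alembic (Y-up)."""
    x, y, z = co
    return (x, z, -y)


def yup_to_zup(co):
    """Convert an Alembic (Y-up) coordinate back to Blender (Z-up)."""
    x, y, z = co
    return (x, -z, y)


# Round-tripping a point should give back the original coordinate.
assert yup_to_zup(zup_to_yup((1.0, 2.0, 3.0))) == (1.0, 2.0, 3.0)
```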
My questions to you are:

  1. Is it okay for you if you have to re-export your Alembic files when switching to Blender 2.79?
  2. What does your pipeline look like? What software do you use to generate and import your Alembic files?

More detailed plans are available on my personal wiki page.

Ideas for Blender 2.8

Blender 2.8 will introduce the concept of collections, which we could use to determine which objects get exported to which Alembic file. Furthermore, these collections could contain information about which modifiers to bake into the mesh (“skip the final subdivision modifier”), or to toggle between render-enabled and viewport-enabled modifiers.

We also want to work on a tighter integration between Blender’s data model and Alembic (or other cache formats), to be able to stream data from Alembic into the scene, without requiring the complex setup of constraints and modifiers that the current importer sets up. Together with other Blender devs we’ll look into an improved in-memory caching system for Blender 2.8.

I’ve already been working on supporting instanced geometry (i.e. multiple objects using the same mesh data) and dupligroups (i.e. empties that duplicate a group). These changes are more likely to be included in Blender 2.8 than 2.79, since they really introduce new features and should be tested thoroughly before release.

My question to you: what are important features for your workflow?

Conclusion

Work on Alembic support in Blender is an ongoing process. What is really important at this moment is your feedback, and actual production-level Alembic files. We need both to be able to support actual film production scenarios. Of course we will treat all files as confidential, although we would also welcome some files that we can use in public test cases & benchmarks.

Please leave your feedback here in a comment or mail me at sybren@blender.studio. To send files, send me a link or send them through a service like WeTransfer.

Eevee Roadmap


In the last post about the Viewport Plan of Action we ended briefly covering the upcoming realtime engine of Blender 2.8, nicknamed Eevee. Eevee will follow the (game) industry PBR (Physically Based Rendering) trend, supporting high-end graphics coupled with a responsive realtime viewport.

Sci-fi armor by Andy Goralczyk – rendered in Cycles with outline selection mockup, reference for PBR

At the time it was a bit early to discuss the specifics of the project. Time has passed, the ideas matured, and now is time to openly talk about the Eevee Roadmap.

Scene Lights

The initial goal is to support all the realistic Blender light types (i.e., all but Hemi).

We will start by supporting diffuse point lights. Unlike the PBR branch, we will make sure adding/removing lights from the scene doesn't slow things down. For the tech savvy: we will use a UBO (Uniform Buffer Object) with the scene light data to prevent re-compiling the shaders.
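To illustrate the idea (a simplified sketch with an assumed light layout, not Eevee's actual code): instead of baking light values into the shader source, which would force a re-compile whenever lights change, the light data is packed into a buffer the shaders read, so editing a light only re-uploads the buffer.

```python
import struct

# Hypothetical light records; in the real engine this data would live in a
# UBO (std140-style layout) indexed by the GLSL shaders.
lights = [
    {"position": (0.0, 0.0, 4.0), "color": (1.0, 0.9, 0.8), "energy": 10.0},
    {"position": (2.0, -1.0, 3.0), "color": (0.2, 0.4, 1.0), "energy": 5.0},
]


def pack_lights_ubo(lights):
    """Pack light data as two vec4s per light: (xyz + pad, rgb + energy)."""
    buf = b""
    for light in lights:
        px, py, pz = light["position"]
        r, g, b = light["color"]
        buf += struct.pack("4f", px, py, pz, 0.0)           # position.xyz, padding
        buf += struct.pack("4f", r, g, b, light["energy"])  # color.rgb, energy
    return buf


ubo_bytes = pack_lights_ubo(lights)
print(len(ubo_bytes), "bytes to upload")  # 2 lights * 32 bytes, no shader re-compile
```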

Next we can support specularity in the shaders, and expand the light support to include area lights. The implementation implies expanding the GGX shader to account for the UBO data.

We will also need to rework the light panels for the Eevee settings since not all settings make sense for realtime.

Soft shadow

We all like our shadows smooth as a baby butt. But realistic smooth shadows are computationally expensive. For the realtime mode of Eevee we will follow the shadow buffer implementation we have in Blender now. For offline Eevee rendering (i.e., playblast) we can crank that up and raise the bar.

Regular Materials

Uber Shaders

Things are lit and well, but we still need materials to respond to it.

We will implement the concept of Uber shaders following the Unreal Engine 4 PBR materials. Since Eevee's goal is not feature parity with UE4, don't expect to see all the UE4 uber shaders here (car coating, human skin, …).

An Uber shader is mainly an output node. For this to work effectively we also need to implement the PyNode Shader system. This way each (Python) Node can have its own GLSL shader to be used by the engine.

UI/UX solution for multi-engine material outputs

Multiple engines, one material, what to do?

A material that was set up for Cycles and doesn't yet have an Eevee PBR node should still work for Eevee, even if it looks slightly different. So although we want to support Eevee's own output nodes, we plan to have a fallback solution where other engines' nodes are supported (assuming those nodes follow the PyNode Shader system mentioned above).

Convert / Replace Blender Internal

After we have a working pipeline with Eevee we should tackle compatibility of old Blender Render files. That said, the Cycles fallback option should be enough to get users to jump into Eevee from early on.

Advanced Materials

More advanced techniques will be supported later, like:

  • SSS
  • Clear Coat
  • Volumetric

Image Based Lighting

We will support pre-rendered HDRIs first, followed by in-scene, on-demand generated probes. This lets scene objects influence each other (reflections, diffuse light bounces, …).

We need the viewport to always be responsive, and to have something to show while probes are calculated. Spherical harmonics (i.e., diffuse only) can be stored in the .blend for quick load while probes are generated.

Time cache should also be considered, for responsiveness.

Glossy rough shaders


Agent 327 Barbershop project by Blender Institute – rendered in Cycles, reference of Glossy texture in wood floor

We can't support glossy reflections with roughness without prefiltering (i.e., blurring) the probes. Otherwise we get terrible performance (see the PBR branch :/) and a very noisy result.

Diffuse approximation

Visual representations of the first few real spherical harmonics, from Wikipedia

There are multiple ways to represent the irradiance of the scene, such as cubemaps and spherical harmonics.

The more accurate way is to use a cubemap to store the result of the diffuse shader. However, this is slow, since it requires computing the diffusion for every texel.

A known compromise is to store low-frequency lighting information in a set of coefficients (known as Spherical Harmonics). Although this is faster (and easy to work with), it fails in corner cases (when lights end up cancelling each other out).

Eevee will support spherical harmonics, leaving cubemaps solely for baked specular results.
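To illustrate the compromise, here is a toy sketch (not Eevee's implementation, and skipping the cosine-lobe convolution a real irradiance evaluation would apply): it projects directional light samples onto the first two SH bands (4 coefficients) and shows the cancellation corner case mentioned above.

```python
import math


def sh_basis(d):
    """Real SH basis for bands 0 and 1 (standard normalization constants)."""
    x, y, z = d
    return (
        0.282095,      # Y_0^0
        0.488603 * y,  # Y_1^-1
        0.488603 * z,  # Y_1^0
        0.488603 * x,  # Y_1^1
    )


def project(samples):
    """Project (direction, intensity) samples onto 4 SH coefficients."""
    coeffs = [0.0, 0.0, 0.0, 0.0]
    weight = 4.0 * math.pi / len(samples)  # Monte Carlo weight over the sphere
    for direction, intensity in samples:
        for i, b in enumerate(sh_basis(direction)):
            coeffs[i] += intensity * b * weight
    return coeffs


def eval_lighting(coeffs, normal):
    """Reconstruct the stored low-frequency lighting toward a normal."""
    return sum(c * b for c, b in zip(coeffs, sh_basis(normal)))


# Two opposing lights of equal intensity: the band-1 terms cancel out,
# which is exactly the kind of corner case mentioned above.
samples = [((0.0, 0.0, 1.0), 1.0), ((0.0, 0.0, -1.0), 1.0)]
coeffs = project(samples)
print(eval_lighting(coeffs, (0.0, 0.0, 1.0)))  # same result for any normal
```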

Probe Objects

Like force fields, probes should be empty objects with their own drawing code.

Environment map array

Reference image from Unreal Engine 4, room design by NasteX 

Large objects (such as a floor) may need multiple probes to render the environment correctly. In Unreal, an environment map array handles this on supported hardware. This is not compatible with OpenGL 3.3 core, but we can still support it (via an ARB extension) on modern graphics cards. We should also look at alternatives compatible with both old and new hardware, such as tetrahedral maps.

Post Process Effects

For the Siggraph deadline we need the following effects:

  • Motion Blur
  • Bloom
  • Tone Map
  • Depth of Field
  • Ground Truth Ambient Occlusion

Other effects that we would like to implement eventually:

  • Temporal Anti-Aliasing (to fix the noisy fireflies we get from glossiness)
  • Screen Space Reflection (more accurate reflection, helps to ground the objects)

Epilogue

Blender 2.8 Viewport design mockup by Paweł Łyczkowski, suggesting fresnel wires over a Cycles preview

Just to recap: the core features mentioned here are to be implemented by Siggraph 2017, with a more polished, usable version by the Blender Conference.

The viewport project (of which Eevee is a core part) is being developed in the blender2.8 branch. It is still a bit early for massive user testing, but stay tuned for more updates.

Edit: You can find the Eevee roadmap organized chronologically and with a more technical wording at the Blender development wiki.

Blender 2.8 Workflow Workshop Video


Here is a little video documentary of the Blender 2.8 Workflow Workshop that took place in Amsterdam between 25 and 27 November 2016.

It’s great to see that after a few months since that meeting Blender 2.8 development is picking up quite some speed!

Viewport: Past, present and future


It’s been too long. Do you remember Eevee? A lot of development happened in the past month, so listen up.

Big picture: What we did so far

HDRI, PBR, realistic light shadows, post process effects, probes … all of this in real-time, integrated with object mode, mesh editing, …

Do you want to see a glimpse of this while it evolves? Follow the Blender developers video channel:

We demo’ed Cycles integrated with the viewport. Eevee reflections and bloom. And a brand new hair drawing system – fast, smooth, and good looking.

Plans for Siggraph

At the Blender booth at Siggraph (1-3 August) we will show a demo of Blender 2.8. The focus will be on fast playback of realistic characters and environments in real-time.

By then we should also have: indirect light, reflections, transparency, and tons of performance improvements.

Buildbot

Feeling the hype? Get hands-on with the latest builds. Fresh every day at builder.blender.org/download!

Community tests & call for help!

Thanks to the wonderful community, we already have early adopters and beta-testers spreading the love online.

Have any work of your own that you want to share? We welcome it, and look forward to seeing more examples and use cases.

Bloom / Blender 2.8 (Test build) Eevee PBR Engine RealTime by Юрий Жестников

For developers willing to join the project, check Eevee’s roadmap and the more technical viewport optimization document.

After Siggraph

The fun doesn't stop there. We expect more people to try Blender after all the demos. Which means the focus will be on usability and polish.

You should see Cycles fallback, Blender Internal conversion, and advanced features such as SSS and Volumetric.


Shiny creatures – early Blender 2.8/Eevee tests by Andy Goralczyk (@artificial3d)

Grease Pencil 2.8 sneak peek


Daniel gives us a quick sneak peek at the new Grease Pencil in the upcoming Blender 2.8.

The new Grease Pencil’s main focus is to create a more friendly interface for the 2D artist, while keeping the advantages of having a full 3D suite underneath. Grease Pencil is no longer just a stroke, it’s now a real Blender object with huge improvements to brushes and tools.

Credits go to the Grease Pencil developers Antonio Vazquez and Joshua Leung, while Daniel Martinez Lara and Matias Mendiola support the developers with demos and testing.

This development is currently happening on the ‘greasepencil-object’ branch, based off 2.8, and it will be merged soon so everybody can test using the build-bot.

Improvements to the Cloth Simulator


Here we take a look at the most significant developments in the cloth simulator from the past few months. This work was based on the proposal found here. The code developed during this time is at the moment still in its own branch, but will be merged at some point during the 2.8 series.

Mass Spring Model

With the old mass-spring model used by Blender, the user had no independent control over bending and compression, or stretching and shearing. As shown in the image below, the tension and shearing springs were coupled together.


Tension and shear springs shown in blue, and compression/bending springs shown in red. (springs of same color translate to a single property in the UI)

Also, a single linear spring was responsible for both bending and compression resistance, as can be observed in the animation below. Furthermore, the usage of linear springs for resisting bending deviates significantly from actual cloth behavior, and even allows bends to be flipped to the opposite direction without any resistance.


(Cross-sectional view) This video shows that the same spring was responsible for providing both compression and bending resistance, by pushing opposite sides of a bend away from each other.

The new model has componentized springs, plus newly added angular bending springs. Combining these, the user has complete independent control over every aspect of the cloth stiffness.


Tension springs shown in blue, compression springs shown in red, shear springs shown in cyan, and angular bending springs shown in green. (springs of same color translate to a single property in the UI)

Beyond that, the angular springs behave much more realistically, and are signed, meaning that the cloth is aware of the direction of the bend, thus not allowing it to get flipped.


(Cross-sectional view) Here in the new model it can be seen that a new dedicated angular spring was added specifically for bending, while the compression springs always stay flush with the cloth surface.

Below are comparisons of various aspects of the cloth model that were improved.


Comparison of the old cloth bending model (left) and the new model (right).


Comparison of the old combined isotropic cloth tension model (left) and the new componentized anisotropic stretch/shear model (right).
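For illustration, here is a toy sketch (not the actual simulator code) of what componentized and signed springs mean in practice: a linear spring picks a different stiffness depending on whether it is stretched or compressed, while a signed angular spring resists deviation from a rest angle and knows which way the bend goes, so a flipped bend is actually resisted.

```python
def linear_spring_force(length, rest_length, tension_k, compression_k):
    """Componentized linear spring: separate stiffness for stretch vs. compression."""
    strain = length - rest_length
    k = tension_k if strain > 0.0 else compression_k
    return -k * strain  # force magnitude along the spring axis


def angular_spring_moment(angle, rest_angle, bending_k):
    """Signed angular bending spring: 'angle' is the signed dihedral angle
    between two faces, so a bend flipped to the other side is resisted
    instead of passing through with no resistance."""
    return -bending_k * (angle - rest_angle)


# A stretched spring uses the tension stiffness...
print(linear_spring_force(1.2, 1.0, tension_k=10.0, compression_k=2.0))  # -2.0
# ...while a compressed one uses the independent compression stiffness.
print(linear_spring_force(0.8, 1.0, tension_k=10.0, compression_k=2.0))  # 0.4
```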

Plasticity

Plasticity is the most significant property of deformable materials that was missing from Blender's cloth simulator. Plasticity is the property by which materials retain deformations after being subjected to stresses, and thus don't return completely to their original shape. In addition to the improved mass-spring model, the inclusion of this property is the final piece in enabling the simulation of virtually any known deformable sheet material.


The same simulation shown without plasticity (left), and with plasticity (right).
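A toy sketch of the plasticity idea (made-up parameter names, not the branch's actual code): once the strain of a spring passes a yield threshold, part of the excess deformation is absorbed permanently into the rest length, so the material no longer springs all the way back.

```python
def apply_plasticity(length, rest_length, yield_strain, plasticity):
    """Shift the rest length toward the current length once the strain
    passes the yield threshold; 'plasticity' in [0, 1] controls how much
    of the excess deformation is retained permanently."""
    strain = (length - rest_length) / rest_length
    if abs(strain) > yield_strain:
        sign = 1.0 if strain > 0 else -1.0
        excess = length - rest_length * (1.0 + yield_strain * sign)
        rest_length += plasticity * excess
    return rest_length


rest = 1.0
rest = apply_plasticity(1.5, rest, yield_strain=0.2, plasticity=0.5)
print(rest)  # rest length has crept toward the stretched state: 1.15
```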

External Rest Shape

The "Dynamic Basemesh" feature allows the underlying cloth mesh animation to be used as the new rest shape on each frame, thus enabling effects where the cloth changes shape throughout the simulation. However, an issue arises when this feature is used in combination with pinning. Because the pins also use the underlying mesh state to determine vertex locations, one could not have independent control over the pin locations and the dynamic rest shape. Support has now been added for using another mesh (with identical topology) as the rest shape.

The video below demonstrates the described issue, and shows how one can now animate rest shape and pin locations independently.


From left to right: (Top row) Animated cloth mesh (before simulation); Simulated cloth using own mesh both for pinning and as dynamic rest shape; (Bottom row) Animated mesh used as rest shape; Simulated cloth using external rest shape mesh;

Eevee Live Q&A


Join Dalai Felinto and Pablo Vazquez streaming live from the Blender Institute, answering questions and demoing Eevee's latest features.

Modifiers in Grease Pencil 2.8


Developer Antonio Vazquez has just implemented the first modifiers for Grease Pencil objects.
Check out this demo by Daniel Martinez Lara:

The new modifiers shown in this video:

  • Array
  • Lattice
  • Simplify
  • Noise
  • Tint
  • Hue Saturation Value
  • Blur

Even though more modifiers are planned, the focus now is to squash as many bugs as possible before the merge into the main Blender 2.8 branch. This development is currently happening in the ‘greasepencil-object’ branch.

PS: Watch it until the end for a nice surprise!

View Layers and Collections


By popular (and somewhat confused) request, we are shedding some light on the new View Layers and Collections in Blender 2.8.

Where are the Objects?

In Blender, objects are not directly part of the scenes. Instead, they all get stored in a main database (basically the .blend file), and from there they are referenced into as many Scenes as you like.

 

In 2.7 the Scene simply referenced its objects directly, as a single list. In 2.8, however, all the Scene objects are part of a new concept: the “master collection”.


Collections

While the master collection contains all the Scene’s objects, the user can also make their own collections to better organize these objects. It works like a Venn diagram, where all the objects are part of the master collection, but can also be part of multiple collections.

 

The result is a clear and flexible way to arrange objects together on the Scene level.

Naming and Nesting

Collections can be named and sorted hierarchically. Just like folders can have sub-folders in any operating system, collections can have nested collections too.

For example: a house collection can contain a bedroom collection, which in turn contains a furniture collection referencing a bed, a cabinet and other objects.
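A minimal sketch of how this could look from Python, using a collection API along the lines of what is landing in 2.8 (this must run inside Blender, and the "bed" object is assumed to exist):

```python
import bpy

scene = bpy.context.scene

# Nested collections: house > bedroom > furniture, mirroring the example above.
house = bpy.data.collections.new("house")
bedroom = bpy.data.collections.new("bedroom")
furniture = bpy.data.collections.new("furniture")

scene.collection.children.link(house)  # scene.collection is the master collection
house.children.link(bedroom)
bedroom.children.link(furniture)

# The same object can be linked into several collections (the Venn diagram idea).
bed = bpy.data.objects["bed"]  # assumes an object named "bed" exists
furniture.objects.link(bed)
```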


So far we discussed managing a Scene, organizing it as collections of Objects. Now we look at how to put the collections to actual use. That is where the new layer system comes into the picture.

View Layers

In 2.8 the visibility controls are part of View Layers, designed to help organizing what you want to see or work on.

 

View Layers (or ‘views’) reference Scene collections, and allow setting their visibility, selectability and other options. Each View Layer can use any collection you wish, and multiple View Layers can use the same collections or different ones.

In that sense this new View Layer is similar to the 2.7 “render layer” – but since View Layers are now part of how we work with workspaces and viewports we decided to name it differently.

An exciting future option for the 2.8x roadmap is to use View Layers as compositing units, also for real-time rendering. For simplicity now, we stick to having a single ‘active View Layer’ per Blender window.

 


That’s all folks

We will soon implement an interface in the viewport and the properties editor to give collections and view layers all the attention they deserve.

In the meantime, keep an eye on the Blender 2.8 project and don't forget to check the daily builds!

Post by Dalai Felinto, Pablo Vazquez and Ton Roosendaal.

Blender 2.8 Design Document

  • Goal
  • Big picture
  • View layers and collections
  • Workspace
  • User interface
  • Engine and draw manager
  • Asset management
  • Dependency graph

Goal

This document is for Blender developers and other contributors to the 2.8 project. It is meant to provide focus and general agreement on technical design decisions, and should empower everyone to contribute more efficiently. UI and further usability design of features will happen after this, as an ongoing project.

“Blender 2.8 will enable artists to work faster and more efficient. Emphasis is on helping specialists or to enable specialist tasks better. 2.8 will allow Blender template/configurations to start with reduced or different user interfaces.”

The idea is to optimise one’s personal workflow better, instead of trying to maintain a single configuration that attempts to provide all the possible tasks at the same time for everyone.

Big picture

Each Blender window has all of its editors sharing the same state. That means there is always a single active Scene, visible objects, drawing engine, (editing) mode, and render engine. The render and viewport settings are being shared by all the editors.

More settings and viewport overrides might be brought in order to accommodate artist feedback. This feedback will be processed after the intended design has been implemented.

Targets

The original plan for the Blender 2.8 workflow release was rather ambitious. To make sure development doesn't drag on forever, and everyone knows what we are working towards, we define the 2.8 project goal and targets as:

End-user targets

  • View layers, collections and overrides
  • Workspaces
  • Top bar with global tool area, and headers
  • Blender 101 – optimize the interface for specific tasks
  • Tool system and manipulators
  • PBR viewport render engine (Eevee)
  • General purpose engine (Workbench)
  • Pipeline for complete 2d animation with grease pencil
  • Amber asset engine (basic, local asset management)
  • Everything proxyable (static overrides)

Technical targets

  • Replace legacy OpenGL with newer OpenGL (3.3 core for now)
  • Replace Derived Mesh (internal modifier storage) with new system (using new Dependency Graph)
  • Interleaved global/local undo stack (material changes in Edit Mode are currently not registered by Undo)
  • Static override system
  • Asset engine API
  • Improved dependency system
  • Draw manager
  • Interface templates
  • Separation of existing grease pencil annotation tools and grease pencil objects

Orphan targets

Targets lacking developer-power

  • Rewrite particle system
  • Multi-object editing

Undefined targets

Targets with vague design to be discussed further

  • Workspace own add-ons and keymaps
  • Game engine

Not targets

Targets better left alone for Blender 3.0

  • Everything node
  • An in-depth redesign of our functional user interface

View layers and collections

The View Layer system for Blender 2.8 will integrate workflow and drawing, allowing for better Scene organization and viewport control.

Please read this if you’re not familiar with the very powerful new concepts for 2.8 layers.
https://code.blender.org/2017/09/view-layers-and-collections/

Workspace

Workspaces allow users to build up a working environment for a specific task.

Workspaces are simply a wrapper on top of the Blender Screens with extra added functionality such as drawing engine, view layer and viewport settings. It might also include custom keymaps and active add-ons.

Overlay engine

In 2.8 all of the edit modes are part of an overlay engine system, and their settings are all stored in the Workspace. For example: "show normals", "brush size", "object origins".

This includes all the tool and display related settings. Some settings are still allowed to stay per-object (e.g., motion path and ghost settings), but whether to display them belongs to the Workspace.

The overlay engine draws on top of the viewport's own drawing engine. It's not part of the final render. This enables, for example, using Object Mode and Edit Modes in a viewport rendering with Cycles.

Viewport settings

Some hand-picked render settings will be exposed to the Workspaces as well as Scene, we are calling them viewport settings.

For instance, the AO distance depends on the Scene scale, and would be an annoyance to set per Workspace. However, the number of samples, or whether to use AO at all, is a perfect fit for a Workspace.

View Layers can have specific viewport settings overrides (as well as render settings).

Engine and view layer

The engine, View Layer and the viewport settings are set per Workspace, and therefore affect the entire window. However the final render (which is a Scene setting) is not influenced by it.

This way a file opened without UI will not lose its render settings. For that we store an active View Layer and engine in both Workspace and Scene data blocks.

Use scene settings

For viewport drawing, Workspaces have an option to use the Scene render settings (F12) instead of the viewport settings.

This way users can quickly preview the final render settings, engine and View Layer. This will affect all the editors in that workspace, and it will be clearly indicated in the top-bar.

Active mode

The Workspace stores a mode, instead of having it per object. That means if you switch the active object in a Workspace that is in Sculpt Mode, the new active object will be in Sculpt Mode as well.

The undo stack will store a mix of global and local stacks, to be able to flush the edited data when needed. For example when two windows have the same active object in different modes, and you alternate between them to do sculpting or mesh editing.

If syncing the same object in different modes becomes an issue, we switch the workspace to Object Mode, keeping the original active mode as the go-to mode when toggling back.

One of the driving forces for this feature is multi-object editing. This may not be implemented in time for the first 2.8 release though.

User interface

The Blender 2.5 design is still valid, we are not refactoring the entire Blender user interface. That being said, we do need to implement the following changes.

Top bar

The top bar will be a global area at the top of each window. It is a place for extended settings, and tools.

All the editors follow the top bar. That means per window we have one Scene, one View Layer and one engine used everywhere.

Templates

Apart from the always present top-bar, we will allow most of the Blender interface to be customizable. This is also known as the Blender 101 project.

The Blender 101 project goal is to be able to create a simplified version of Blender for specialized work such as:

  • 3d printing
  • Architectural planning
  • Movie previsualization

A template can be created with Python. Starting Blender with a template will only make UIs and options available as defined by the template.

Optionally, the template can allow a complete Blender to be accessible.

Workspace

The Workspace settings will be split across:

  • Top bar: layer, engine, mode
  • Properties editor: render settings
  • Viewport: tool settings

The tool settings are part of the overlay engine settings and thus stored in the workspace and global for the entire window.

Collections

A panel for collections will be present in the viewport and the properties editor for the Scene and Workspace tabs.

And we may need a tiny collection enable/disable template in the viewport header. We have no design yet but the goal is to mimic the simplicity and ease of use of the 2.7 visibility bits.

To edit the collection members we will use the outliner.

Manipulators

The manipulator and operator widgets bring the work into the 3D viewport. These are the main guidelines when designing manipulators:

  • All interactive elements of a manipulator should be visible (e.g., not only on cursor hover).
  • They should be toggleable in editors like the UV editor. In the dopesheet, box selection is more unanimously agreed upon, so widgets can be more omnipresent.
  • The exception of hidden information can be changing cursors based on the possible action (translate up, down, all directions, rotate, scale).
  • The whole widget area shouldn't be an action area (as with, for example, a panning widget).

Manipulator widget design

Collecting the cases for 2D manipulators will lead to the creation of a new interaction language in Blender. For 2D it will likely converge toward what we see in other 2D drawing software, such as Google Draw and Inkscape.

For 3D there are fewer open source references to study from, so we will have to extrapolate from the 2D designs and create our own.

Established use cases

Paradigms to consider for the manipulators design:

  • Using tablet as main input device
  • ‘Face mapping’ rigging and posing
  • Virtual reality work – 3d manipulators for tools and operators

Tool system

The non-modal tool usage will be supplemented using manipulators. Instead of having to execute a tool from a shortcut, it will additionally be possible to execute it using visual handles.

The tool-shelf will use this heavily: Selecting a tool from the tool-shelf will activate its manipulators if available. The active tool will have some of its options exposed in the top bar as buttons, as well as its own widgets and events.

Only a single tool is active at a time, and transformation manipulators are only visible when no tool is active.

For example: Extrude tool manipulators mockup.

Engines and draw manager

The draw manager allows composing overlay engines (the editing-mode engines, i.e. object, mesh, sculpt, paint) on top of a render engine.

The concept of a render engine is to allow it to work interactively in viewports, and to work as an offline engine (e.g. Cycles). That means the new real-time engine Eevee can also be used as a production render engine.

Eevee engine

Eevee is a new real-time engine which supports high-end PBR graphics coupled with a responsive realtime viewport.

See: https://code.blender.org/2017/03/eevee-roadmap/

Workbench engine

This is a complete engine designed for modeling. It could be a beautiful white canvas with floor shadows for modelers to work in. It does not aim at realism, but at the right visualisation for a certain task, and especially at working perfectly with the overlay engines.

Grease Pencil

Grease Pencil becomes consolidated as an annotation tool, while a full-fledged 3D storyboarding system is implemented with the new Grease Pencil objects.

See: https://wiki.blender.org/index.php/User:Antoniov/Grease_Pencil_as_Object

Although technically this is a new engine, it's currently an integral part of the overlay system. This way the 2D drawings can be combined with any other engine. We keep this as the only exception to the rule, as a prelude to an upcoming viewport compositing system.

Asset management

We will build an integrated system to handle files, projects and users' asset libraries, allowing browsing and importing of data blocks, textures, and so on.

See: https://code.blender.org/2016/10/asset-management-and-pipeline/

Dependency graph

The dependency graph is responsible for dynamic updates of the Scene, where values vary over time – e.g. to enable animation. The new dependency graph was designed from scratch to allow multi-threading, efficient duplication and overrides. It uses a new data model for scene data, replacing the old ‘Derived Mesh’ from the modifier system.

Ultimately this new system will even allow viewing a Scene in different states in different windows. For example, two windows where one plays the animation while the other stays on a still frame.

See: https://wiki.blender.org/index.php/Dev:2.8/Source/Depsgraph

Everything proxyable

Any data block (especially referenced ones from external files) will be able to have local properties. And as far as the tools and editors in Blender are concerned, the linked data with its modifications will be treated the same as any local data block.

See: https://developer.blender.org/D2417

This also means that the Armature Proxy will go away, and get replaced with this new system.

Overrides

View layers and collections can override draw and data block options. Those overrides are local to the view layers, but handled as part of the dependency graph rewrite.

See: https://wiki.blender.org/index.php/Dev:2.8/Source/overrides


Design document signed off by: Bastien Montagne, Brecht Van Lommel, Campbell Barton, Dalai Felinto, Jonathan Williamson, Julian Eisel, Pablo Vazquez, Sergey Sharybin, and Ton Roosendaal

Cycles Benchmarks – Nvidia update


In-between all other duties here in Blender Institute, Sergey Sharybin is building a 24/7 test farm running Blender’s daily builds to check on various hardware configurations. The tests will be for Cycles renders at first, but will include other tests too.

The purpose is that the daily stats + images will get published online automatically, with a nice website UI and history browsing. (And that's most of the work, which is a reason to be a bit more patient for now!)

As a sneak peek, here's the latest Cycles stat with Blender 2.79 (release). It includes four new GPUs – provided by Nvidia two weeks ago.

Note that the super fast Quadros are quite expensive! However, prices change all the time – not sure if we should include such information in stats overviews. Just google it.

(Click on image for more info and a live web page)


-Ton-
