Re: [apache/tvm-rfcs] [Process RFC] Empowering New Scoped Module to the Project (PR #95)
Hi all, as suggested in the thread, we held this discussion for a while, and now is a good time to come back to it. Let me summarize the previous discussion here:

- **Scoped module.** A scoped module (S0-module) is one that meets the following criteria:
  > - Clearly isolated in its own namespace.
  > - Clearly needed by some users in the community.
  > - No disruptive change to the rest of the codebase.
  > - Can be easily deprecated by removing the related namespaces.
  > - Can be turned off through a feature toggle to contain the overall dependency from the rest of the modules.

  (See the illustrative sketch at the end of this message for the namespace and toggle criteria.)
- **Voting mechanism.** Establishing a scoped module is not a typical code change; it needs majority support from the PMC. This kind of voting mechanism is also used in other Apache projects (e.g., the [Apache Hadoop bylaws](https://hadoop.apache.org/bylaws.html) and the [Apache Hive bylaws](https://cwiki.apache.org/confluence/display/Hive//Bylaws)).
- **Evaluation factors.** The community should evaluate a scoped module on a variety of factors, wearing the project hat:
  > - Fit with the rest of the modules and the overall project.
  > - Test strategy, modularization, and documentation.
  > - The scope of impact of the added module, and levels of open-mindedness.
  > - Competitive landscape of the overall MLC space and enablement of the project towards goals that are not supported at the moment.
  > - Community empowerment in general: e.g., contributors who would become an added force to offset development complexity, and who in many cases would also contribute to other existing modules.
- **Quality bar.** A scoped module is not meant to be a low-quality module. All code changes for scoped modules go through the same code review mechanism.

Please let me know if I missed any public voices or considerations, and let's continue the discussion and finalize the RFC.

--
Reply to this email directly or view it on GitHub:
https://github.com/apache/tvm-rfcs/pull/95#issuecomment-1336400761
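To make the namespace-isolation and feature-toggle criteria above concrete, here is a minimal, hypothetical sketch. The namespace `tvm.foo_module` and the toggle `TVM_ENABLE_FOO` are made up for illustration only; the RFC does not prescribe any specific toggle mechanism.

```python
# Illustrative sketch only: `tvm.foo_module` and TVM_ENABLE_FOO are
# hypothetical names, not part of the RFC or the TVM codebase.
import importlib
import os


def load_scoped_module():
    """Load a hypothetical scoped module only when its feature toggle is on."""
    if os.environ.get("TVM_ENABLE_FOO", "0") != "1":
        # Toggle off: the rest of the codebase runs without any dependency
        # on the scoped module.
        return None
    # All of the scoped module's code lives under its own namespace, so
    # deprecating it later only means deleting that package and this toggle.
    return importlib.import_module("tvm.foo_module")
```

A build-level switch (for example, an optional compile flag guarding the module's native sources) would play the same role for C++ code: with the toggle off, the module contributes no dependencies to the rest of the project, and removing the namespace plus the toggle is a clean deprecation path.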
Re: [apache/tvm-rfcs] [Process RFC] Empowering New Scoped Module to the Project (PR #95)
Thank you, everyone, for the discussion so far, and thank you, @Hzfengsy, for driving the proposal. After reviewing the thread and following all the relevant conversations, I would like to share my support for the proposal. I am making this statement [wearing the Apache TVM hat](https://tvm.apache.org/docs/contribute/committer_guide.html?highlight=apache%20tvm%20hat#independent-project-management). I would like to summarize a few important perspectives from our community members here as guidelines:

## How should we operate as a community?

When considering proposals, members weigh many factors. A lot of them have to do with the impact on code:

- Impact on code: cohesiveness with the overall vision, testing, clarity among modules, quality, stability, the overall complexity of the codebase, and presentation to the users.

This is what we are used to in normal software development settings, and there are trade-offs among these factors. On the one hand, new modules indeed bring different levels of impact on these perspectives depending on their scope. However, in many cases, new modules do not bring impact in a purely negative way.

MLC (machine learning compilation) is a fast-evolving field. The ability to reinvent ourselves and keep up with innovation, while keeping the main components stable, is the key to survival in the ML landscape. PyTorch is exemplary in its development philosophy of bringing in related modules. In many ML frameworks such as PyTorch, allowing space for multiple relevant solutions (e.g., Dynamo, TorchScript, LazyTensor) to coexist and evolve is critical for success. It is also important to allow flexibility of growth rather than planning everything out at the beginning. When TorchFX first came out, there was no clear prediction that Dynamo would become relevant. Even today, there is no detailed plan to replace TorchScript with Dynamo. Nevertheless, community members continue to use the relevant modules and keep an evolving conversation going.

ML is fast evolving, and we should never expect to lock down every aspect of a plan at the beginning. Instead, we should open doors for diverse community members to come, participate, and contribute their perspectives. Such an open philosophy is what the TVM community (or any successful ML OSS project) should aspire to.

This brings us to another **very important** factor, one that is easy to ignore in typical software development settings but should always be put first in Apache projects:

- Impact on the community: bringing in new community members and growing the community.

A lot of that ties back to the "community over code" principle. When evaluating modules, we should certainly consider how the proposal empowers new users and developers. Can the proposal welcome new developers? Can it solve different needs of the community? The TVM project exists because of the community. That means we need to empower each member with their different needs, and sometimes that could mean doing something different from our normal development rationales in order to empower community members, especially when many, or even the majority of, community members have shared their voice and support. This does not mean our original rationale is wrong, but that we stay open-minded and treat community empowerment as the first principle.

## How can we empower each other

Being in the same community means we share the burdens (of maintenance and advocacy) and the joys together.
No process or system is perfect, and the intention was never to assume people are adversarial. The first principle of the Apache community is community over code, which means we empower each other. Some in the discussion thread raised the question of what happens if a person operates in an outright stubborn manner, for example declaring outright rejection of later S1/S2 proposals or code PRs without reviewing the additional context being brought up, and without having a constructive conversation in light of that context. Such behavior would certainly be frowned upon in any OSS community, and it is not a behavior I would expect from any committer or reviewer.

This also comes down to the exclusive versus inclusive mindsets we can have in the community. It is indeed an additional burden for us to help maintain and develop code that is less relevant to our own interests. One could outright dislike code that causes regressions irrelevant to one's own customer use cases. It is hard to settle many of those conversations through a simple technical debate, since people can disagree about which parts are important: to some, the graph IR and executor matter most, while many other TVM users might only care about the FFI and TensorIR while building their own graph integration. When in doubt, always remember to come back to the community. Remember that we empower each other. This is code developed by members of the community.
Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)
After seeing so many voices in this thread, I think it is important to provide a reply here. I am wearing the Apache TVM hat as an ASF member and an Apache TVM PMC member.

First of all, I would like to say thank you, everyone, for sharing your voices here. This post has received support from more than eight organizations, from both industry and academic backgrounds. Your voices are very important to the community and will not be ignored. As many have said, we would love the TVM community to continue being inclusive and innovative while maintaining the stability of existing components.

I also would like to acknowledge the positions so far. The position @leandron has taken is:

- We do not want to be in a state where Relax and Relay coexist without deciding on a commitment of one replacing the other.
- As a result, due diligence on such a replacement is mandatory before merging the proposal.

I would like to explicitly acknowledge that the above positions have valid rationales, are completely valid, and describe one possible way of developing software.

The position raised by @YuchenJin and others is:

- Relax could have the potential to replace Relay, but the proposal as it stands only proposes that the two modules coexist.
- This is how most OSS projects bring in modules and evolve (e.g., TorchFX was brought in even though it overlaps with TorchScript, with no plan to immediately phase out TorchScript). The modules can coexist and evolve, and we continue the conversation about their future co-evolution.
- Relax and Relay coexisting in the codebase is already a positive step that we should take, especially considering community empowerment.

These are also valid rationales and describe another possible way of developing things.

As a first step, I would like us to acknowledge each other's positions, as both rest on valid rationales. The main difference is a disagreement on how we should do things as a community. Such a decision should be made collectively as a community, considering all the factors involved, including code and community factors. We all make our suggestions taking innovation, stability, and community into account.

When evaluating a proposal and empowering our community members, we expect every one of us to continue having a constructive conversation, considering the latest context. While the initial comment made by @leandron is valid on its own, I would love to see us re-evaluate our positions considering all the factors in the latest context, including community empowerment and the collective views of other members. I want to say that by no means do we seek simply to dismiss the original position; I apologize if it came across that way. Instead, we want to acknowledge each view, recognize that our disagreement is about the how, and take the community into consideration. I think we should continue to have constructive conversations in service of the many who have voiced their support here. Thank you!

--
Reply to this email directly or view it on GitHub:
https://github.com/apache/tvm-rfcs/pull/89#issuecomment-1336475126