CFP Scoring Guidelines + Best Practices

Shanghai Expo Centre   |   Shanghai, China


KubeCon + CloudNativeCon China 2019

CFP Scoring Guidelines + Best Practices

Thank you in advance for your efforts as a member of the Program Committee for KubeCon + CloudNativeCon China, taking place June 24–26, 2019, at the Shanghai Expo Centre in Shanghai, China.

These are the official CFP Scoring Guidelines and Best Practices to use when reviewing your set of proposals. Please bookmark this page for easy reference. If you have any questions, please email Nanci Lancaster.




IMPORTANT DATES

  • Must have 100% of your assigned proposals reviewed: Monday, March 4, 2019
  • Schedule Announced: Wednesday, April 3, 2019
  • Event Dates: June 24–26, 2019

SCORING GUIDELINES

Grade the quality of each proposal on a 5-to-1 scale for content, originality, relevance, and speaker(s):

  • 5 (Excellent)
  • 4 (Above Average)
  • 3 (Average)
  • 2 (Below Average)
  • 1 (Poor)

Reminder: You are required to leave comments for each proposal you review, detailing the reasoning for your score.

For each proposal, you will indicate whether or not you see it ultimately being part of the accepted program by stating “yes” or “no.”

If you come across a proposal that does not seem to fit the topic you are reviewing, you will indicate which topic you think fits it best using the optional drop-down menu. Please still grade this proposal as you would any other within your review set.

 

REVIEW PROCESS BEST PRACTICES

  • Time Commitment: Please plan on committing 2–20 hours total to review all of the submissions in your track, depending on the number you have been assigned. Aim to review 10–15 sessions at a time, then take a break and walk away. This helps prevent burnout and allows you to see more proposals with fresh eyes.
  • Process Integrity: It is very important to protect the integrity of the review process, and to avoid undue bias, by keeping the submissions and your comments on them confidential. Please review and adhere to our Code of Conduct.
  • Public & Author Interaction: To ensure an unbiased review process, program committee members should not discuss submissions with authors and/or the overall public (i.e., please no tweeting). Of course, please feel free to tweet about accepted sessions that you are excited to attend once the schedule has been published.
  • Conflict of Interest: Reviewers are asked to wear their “KubeCon + CloudNativeCon” hats rather than their company or other affiliation hats when scoring submissions, so that all submissions are rated fairly. If a submission was written by a colleague you work closely with, or by someone you could be seen as associated with or in competition with, please skip it by marking it as a conflict of interest.
  • Review Metrics: As listed above, the ranking system is divided into 5 options: 5 (Excellent), 4 (Above Average), 3 (Average), 2 (Below Average), 1 (Poor). It is important that you highlight your level of confidence in your recommendation and the reasons why you gave the score you did. When reviewing proposals, keep in mind the following criteria:
    • Relevance – Does the content provide takeaways that are new and exciting, versus information that was “so last year”? Is the content relevant to the conference and the selected track?
    • Originality – Is this a presentation that is original and not one that a speaker repeats at every conference? Is the way the content is presented original?
    • Soundness – Does the content make sense in delivery or is it all over the place? Does the speaker seem to lack focus?
    • Quality of Presentation – Is the proposal engaging and well thought out? Does the background material suggest the speaker will deliver this presentation effectively?
    • Importance – How important is the content for the KubeCon + CloudNativeCon audience?
    • Experience – Is this speaker a good person to deliver this presentation? Does their experience with the subject matter align with the proposed content?
  • Speakers with multiple submissions: We are unlikely to accept more than one talk from the same speaker. If you are in the position of reviewing more than one strong proposal from the same speaker, you can help the program co-chairs by only giving one of them a response of “yes” when answering the question, “do you see this session being part of the accepted programming for this conference.” Please use your comments to indicate why you prefer one talk over another.
  • Review Comments: Keep in mind that the submitting author may be a VP at a large company or a university student. Ensure your feedback is constructive, in particular for rejected proposals: we do receive requests for feedback, and we may pass on some comments (though we would not associate them with you). Good examples of review elements include:
    • Highlighting the positive aspects of a proposal.
    • Providing constructive feedback, e.g., “It would have been helpful if…”, and including facts when applicable.
    • Avoiding direct attacks: say “Their YouTube video gives me concerns about their speaking style” rather than “this person is a terrible speaker.”
  • Panel Discussions: The ideal panel is comprised of diverse thought leaders who talk 80% of the time with 20% audience interaction. Some things to keep in mind when reviewing a panel submission:
    • Is the panel diverse, and is there a mix of genders on the panel? Note for all KubeCon + CloudNativeCon Events: All panels are required to have at least one speaker who identifies as a woman.
    • Is the submission cohesive and does it provide a clear view of how the panel would progress for 35 minutes? Could they cover everything within the proposal in the given 35 minutes?
    • Have they included any sample questions?
    • Does the panel include panelists from different organizations, including the moderator?
    • Research the panelists and moderator, if needed. Is their experience relevant to the topic?
    • Will the panelists provide diverse perspectives or will they repeat the same thing four times?
    • Are there any high-profile panelists?
    • In the instance that 1–2 of the panelists are unable to attend, how would it impact the panel?
  • Breakout Sessions: A presentation is delivered by a topic expert with a fresh or unique point of view. Some things to keep in mind when reviewing presentation proposals:
    • Is the submission well written?
    • Is the topic relevant and original, and are the speakers considered to be subject matter experts?
    • Are they talking about a specific product from their company? If so, is it engaging in a way that is not advertorial? Keep in mind that sessions that come across as a pitch or infomercial for their company are most often rated very poorly among the audience.
    • Who is their target audience? Do the abstract and description match up with the expertise required?
  • Birds of a Feather Session (BoFs): A Birds of a Feather session is a discussion forum where there is someone facilitating only 20% of the time with 80% audience interaction. Some things to keep in mind when reviewing BoFs:
    • Is the submission well written?
    • Are they talking about a specific product from their company? If so, is it engaging in a way that is not advertorial? Keep in mind that sessions that come across as a pitch or infomercial for their company are most often rated very poorly among the audience.
    • Who is their target audience? Do the abstract and description match up with the expertise required?
    • Topics where people share experiences make great BoFs. In these sessions, there are no experts but the interaction tends to be high.

CONTACT US

If you require any assistance reviewing proposals or have questions about the review process or any of the best practices we have suggested, please contact Nanci Lancaster for assistance.


Stay Connected With Our Newsletter

Sign up to be kept up-to-date on the latest developments around KubeCon + CloudNativeCon, like keynote announcements, important schedule and event notifications, exclusive offsite activities, and more.

SPONSORS

STRATEGIC

Huawei Cloud

DOUBLE DIAMOND

Tencent Cloud

DIAMOND

Alibaba Cloud
Intel

PLATINUM

Rancher
SUSE

GOLD

Amazon Web Services
Baidu AI Cloud
CloudBees
Google Cloud
JD Cloud
Linux Foundation Training
Oracle
QingCloud
Red Hat
VMware

SILVER

ARM
BoCloud
Canonical
Cloud Foundry
Elastic
F5 Networks
GitLab
JFrog China
Kong
Mesosphere
Pivotal
Portworx
UCloud
Yahoo Japan

START-UP

Ankr
Cloud to Go
EMQ
Giant Swarm
Hyperledger
Kontena
LF Deep Learning
OpenEBS
PlanetScale
Reduxio
Upbound
Wise2C
Yan Rong Cloud

PARTNER

China Open Source Cloud League
China OSS Promotion Union (COPU)
ePubit
Juejin
Kaiyuanshe
Katacoda
Linux.cn
Linuxstory
Opening Source
OSCAR
Women Who Code

CONTACT US

Before contacting us, please review all event pages as answers to many questions are readily available throughout this site. If you cannot find the answer to your question and would prefer to email us, please contact events@cncf.io.